It depends on your web server's logging format and which fields it is configured to capture; various options can be set to log various things.
Item 1
The first part is the IP address of the client that came to your site.
It requested your robots.txt, which is what a search bot typically looks for to determine whether it is allowed to crawl your site. The request returned an HTTP 404 error, meaning the file was not found. The last part is the "user agent", which looks like a crawler. Essentially, something appears to be indexing your site. If you don't want that, you can add a robots.txt with directives telling robots not to crawl or index some or all of your site, as in the example below.
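For example, a minimal robots.txt placed at the root of your site that asks every crawler to stay away from the whole site would look like this (well-behaved bots honor it, but it is only advisory):

User-agent: *
Disallow: /

You can also list specific paths after Disallow instead of / if you only want to keep bots out of parts of the site.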
Second set
A client came in from that IP address and got an HTTP 200 response, which means the page was returned successfully. The 13724 is, I think, either the time it took to return the response in milliseconds or the size of the page in bytes; again, it depends on your log format and what you are capturing. The last part is the user agent: Windows NT 6.1 is Windows 7, running a French version of Firefox.
see
http://www.useragentstring.com/Firef...9_id_16360.php
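For reference, if your server happens to use the common Apache-style "combined" log format, a line looks like the one below (the values here are invented purely for illustration). In that particular format the number after the status code is the response size in bytes, and a time-taken field only appears if you explicitly add it to the format:

203.0.113.5 - - [12/Mar/2012:10:15:32 -0500] "GET /index.html HTTP/1.1" 200 13724 "-" "Mozilla/5.0 (Windows NT 6.1; fr; rv:1.9.2.24) Gecko/20111103 Firefox/3.6.24"

The fields are: client IP, identity and user (usually "-"), timestamp, request line, status code, bytes sent, referrer, and user agent.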
Since you said you are interested in knowing more, you may be able to change your log format to capture more information. The link below is for a Microsoft IIS web server, but it details the fields of the W3C extended log format, which is not MS-specific, and shows what can be logged. The steps to enable additional fields vary by web server, but at least on IIS it is a trivial set of steps.
http://www.microsoft.com/technet/pro....mspx?mfr=true
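If it turns out you are running Apache rather than IIS, the equivalent change is made with a LogFormat directive in the server configuration. As a sketch (the nickname "combined_time" is just a name I picked), adding %D records the time taken to serve each request in microseconds, which removes the ambiguity between "time taken" and "bytes sent":

LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\" %D" combined_time
CustomLog "logs/access_log" combined_time

After reloading the server, the extra field shows up at the end of each log line.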