Hello, http://arado.sf.net is a cool URL database written in C++/Qt, currently backed by SQLite. You can add your bookmarks, URLs, and RSS feeds to that database, which is also networkable and synchronized over a p2p network.
Now it is time to add a C++ crawler that crawls, to a depth of one or two, all URLs the user opens from the URL database. E.g. the user browses an RSS feed, the crawler fetches the page, and the new URLs found on that page are added back to the SQLite database.

Is there anyone on the list who has made libcurl work with Qt or SQLite, or who can share experience with implementing a very simple function that would:

- get the URL of the browsed website,
- crawl all hyperlinked URLs on that website to a depth of one or two,
- add the newly found URLs to the SQLite database with a title (taken from the page title) plus keywords (as a first step, taken from the page's meta description)?

Thanks for feedback on whether this is possible with libcurl, and whether someone has experience with a simple stripped-down function like this in C++ plus minimal GUI elements in Qt. Ideally the "Browse URL" button would get a companion "Browse and Crawl" function.

Feedback is welcome. Thanks.

Regards
Randolph
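P.S. To make the question more concrete, here is roughly what I have in mind, as untested sketches. First the fetch step, using libcurl's easy interface; fetchPage is just a name I made up:

    #include <curl/curl.h>
    #include <string>

    // libcurl write callback: append the received bytes to a std::string.
    static size_t writeToString(char *ptr, size_t size, size_t nmemb, void *userdata)
    {
        std::string *out = static_cast<std::string *>(userdata);
        out->append(ptr, size * nmemb);
        return size * nmemb;
    }

    // Fetch one URL into a string; returns true on success.
    // Sketch only: a real version would also set timeouts, a user
    // agent string, and call curl_global_init() once at startup.
    bool fetchPage(const std::string &url, std::string &body)
    {
        CURL *curl = curl_easy_init();
        if (!curl)
            return false;

        curl_easy_setopt(curl, CURLOPT_URL, url.c_str());
        curl_easy_setopt(curl, CURLOPT_FOLLOWLOCATION, 1L);
        curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, writeToString);
        curl_easy_setopt(curl, CURLOPT_WRITEDATA, &body);

        CURLcode res = curl_easy_perform(curl);
        curl_easy_cleanup(curl);
        return res == CURLE_OK;
    }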
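For pulling the links, the page title, and the meta description out of the HTML, I was thinking of something regex-based like the sketch below. I know regular expressions are a crude tool for HTML, but for a depth-one crawl they might do; a real parser would be safer. This assumes a C++11 compiler for std::regex:

    #include <regex>
    #include <string>
    #include <vector>

    struct PageInfo {
        std::string title;
        std::string description;
        std::vector<std::string> links;
    };

    // Scrape href targets, the <title> text, and the meta description
    // from raw HTML. Crude, but enough to illustrate the idea.
    PageInfo scrapePage(const std::string &html)
    {
        PageInfo info;

        std::regex hrefRe("<a[^>]+href=[\"']([^\"'#]+)[\"']", std::regex::icase);
        for (std::sregex_iterator it(html.begin(), html.end(), hrefRe), end;
             it != end; ++it)
            info.links.push_back((*it)[1].str());

        std::smatch m;
        std::regex titleRe("<title>([^<]*)</title>", std::regex::icase);
        if (std::regex_search(html, m, titleRe))
            info.title = m[1].str();

        std::regex descRe("<meta[^>]+name=[\"']description[\"']"
                          "[^>]+content=[\"']([^\"']*)[\"']",
                          std::regex::icase);
        if (std::regex_search(html, m, descRe))
            info.description = m[1].str();

        return info;
    }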
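Storing the result would go through Qt's QSQLITE driver. The table and column names below are invented for the example, since I have not settled on a schema; the default connection is assumed to have been opened once at startup with QSqlDatabase::addDatabase("QSQLITE"), setDatabaseName() and open():

    #include <QSqlQuery>
    #include <QString>

    // Insert one crawled URL into SQLite via the default Qt connection.
    // "urls" and its columns are made-up names for this sketch.
    bool storeUrl(const QString &url, const QString &title, const QString &keywords)
    {
        QSqlQuery query;
        query.prepare("INSERT OR IGNORE INTO urls (url, title, keywords) "
                      "VALUES (?, ?, ?)");
        query.addBindValue(url);
        query.addBindValue(title);
        query.addBindValue(keywords);
        return query.exec();
    }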
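And tying the three helpers above together, a depth-limited crawl might look like this; it still ignores duplicates and relative links, which would need resolving against the base URL:

    #include <QString>
    #include <string>

    // Fetch a page, store its URL with title and description,
    // then recurse one level into each link found on it.
    // The depth limit keeps cycles from recursing forever.
    void crawl(const std::string &url, int depth)
    {
        std::string html;
        if (!fetchPage(url, html))
            return;

        PageInfo info = scrapePage(html);
        storeUrl(QString::fromStdString(url),
                 QString::fromStdString(info.title),
                 QString::fromStdString(info.description));

        if (depth <= 0)
            return;
        for (const std::string &link : info.links)
            crawl(link, depth - 1);
    }

So "Browse and Crawl" would essentially call crawl(currentUrl, 1) or crawl(currentUrl, 2). Does that look like a sane direction with libcurl, or is there a better-trodden path?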
