Hello. I have been thinking a bit about how to make wget better. We would need a scripting system so that features can be programmed more easily. One way to incorporate scripting into wget would be to rewrite wget as a data-flow system, much the same way that OpenGL (www.opengl.org) is a data flow for graphics. The scripts would be executed at specific places in the data-flow graph, much the same way that vertex and fragment programs are executed at specific places in the OpenGL pipeline.
So, the URLs would enter the data flow and the routines in the graph would do something to them. I don't know yet exactly what kind of graph we would have, but here is a simple one: url input --> url processing --> site exclusion --> dir path exclusion --> get file --> and from there the graph goes deeper into parsing the HTML. Example: I could add a script just after the "get file" node. The script would uncompress the downloaded file into a new file and change the local_filename variable to the name of the new file. The graph would also make different granularities possible: details can be added later by splitting the large graph nodes. Regards, Juhana
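To make the idea concrete, here is a minimal sketch of such a pipeline (in Python rather than wget's C, purely for brevity). All names here are hypothetical, not wget internals: each graph node is a function that takes and returns a "job" record, and a user script is simply an extra node spliced in at a chosen point, such as the uncompress step just after "get file".

```python
import gzip
import os
import tempfile

def url_processing(job):
    # hypothetical stand-in for URL normalization
    job["url"] = job["url"].strip()
    return job

def site_exclusion(job):
    # returning None drops the job; downstream nodes never see it
    if "excluded.example.com" in job["url"]:
        return None
    return job

def get_file(job):
    # stand-in for the real download: writes a gzip-compressed file
    # and records its name in local_filename, as in the example above
    path = os.path.join(tempfile.gettempdir(), "page.html.gz")
    with gzip.open(path, "wb") as f:
        f.write(b"<html>hello</html>")
    job["local_filename"] = path
    return job

def uncompress_script(job):
    # user script spliced in just after "get file": uncompress the
    # downloaded file and point local_filename at the new file
    src = job["local_filename"]
    dst = src[:-3] if src.endswith(".gz") else src + ".out"
    with gzip.open(src, "rb") as f_in, open(dst, "wb") as f_out:
        f_out.write(f_in.read())
    job["local_filename"] = dst
    return job

def run_pipeline(nodes, job):
    # walk the (linear) data-flow graph; a None result means "excluded"
    for node in nodes:
        job = node(job)
        if job is None:
            return None
    return job

pipeline = [url_processing, site_exclusion, get_file, uncompress_script]
result = run_pipeline(pipeline, {"url": "http://example.com/index.html"})
print(os.path.basename(result["local_filename"]))  # page.html
```

A linear list is the coarsest granularity; splitting a node later just means replacing one function in the list with several, which is the refinement the graph idea is meant to allow.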