Check this program:
#include <map>
#include <string>
#include <iostream>
#include <algorithm>
#include <iterator>
using namespace std;
class Range : public pair<const char*, const char*> {
public:
    size_t size() const { return second - first; }
    Range(const char* f = 0, const char* s = 0)
        : pair<const char*, const char*>(f, s) {}
};
Yes, that is the idea. You assume each keyword is a single character and
you use a map from keywords to their counts. Each time you extend the
range on the right-hand side, you may increase the counts of some
keywords, and each time you shrink the range from the left-hand side,
you decrease them.
I have a rough idea of what you are concerned about. Yes, DFS is usually
implemented as recursion, and any algorithm that uses the call stack to
store its search paths risks stack overflow. I had never thought of
crawling the Web as an application of DFS, but it is, I have to
say.
To me, if
By the way, if you implement DFS in a non-recursive style, that means
you implement the stack yourself. Generally you can create that stack in
heap memory rather than in the call stack's memory space. This gives
your DFS a much larger capacity. But for tasks like crawling the Web,
this still can
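The explicit-stack DFS described above can be sketched like this. The graph is a made-up adjacency list standing in for "pages and the links they contain"; `std::stack` keeps its elements on the heap, so the search depth is bounded by available memory rather than by the call stack:

```cpp
#include <map>
#include <set>
#include <stack>
#include <string>
#include <vector>

// Non-recursive DFS: the pending stack replaces the recursion, and a
// visited set prevents revisiting pages reachable by more than one path.
std::vector<std::string> dfs(
        const std::map<std::string, std::vector<std::string>>& links,
        const std::string& start) {
    std::vector<std::string> visitOrder;
    std::set<std::string> seen;
    std::stack<std::string> pending;   // heap-backed, unlike recursion frames
    pending.push(start);
    while (!pending.empty()) {
        std::string page = pending.top();
        pending.pop();
        if (!seen.insert(page).second) continue;  // already visited
        visitOrder.push_back(page);
        auto it = links.find(page);
        if (it == links.end()) continue;
        // push neighbors in reverse so they are popped in listed order
        for (auto rit = it->second.rbegin(); rit != it->second.rend(); ++rit)
            pending.push(*rit);
    }
    return visitOrder;
}
```

Swapping `std::stack` for `std::queue` here turns the same skeleton into BFS, which is what crawlers more commonly use.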