Hi! I've been looking all over for a solution to a problem I can't even name with any certainty. Here's what I want to do:
1. Often online manuals for software (or other subjects) will be broken up into many separate html pages, usually with a table of contents page linking to the other pages in the document. Each page will also sometimes have links to sub-pages, and so on.
2. I suppose the idea is to "chunk" the information into usable pieces, each one a separate html page, instead of putting up a single but very long page with links at the top to anchors distributed throughout as needed.
3. I prefer the single page approach, especially if I want to print out the damn thing and put it in a binder for easy (in-the-hand-not-on-the-screen) reference. I find the chunking of the other approach very frustrating.
4. So, I'm looking for a way to crawl through the document, putting all the chunks together into a single file that I can print out or at least easily scroll up and down in. Oh, and using cut-and-paste to manually build a single file is tedious and fraught with all kinds of frustrations. In other words, it sucks!
5. I don't even really know what to call this process, but it seems like a variation of web crawling, but limited to a single chunked document and putting it together into a properly-ordered whole.
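For what it's worth, here's a rough sketch of what I mean, in Python using only the standard library: parse the table-of-contents page, follow its links in the order they appear, and glue each page's HTML together into one document. All the URLs and the structure assumed here are made up for illustration; a real manual would need tweaks (relative links, nested sub-pages, a polite delay between requests, etc.).

```python
# Sketch of "un-chunking" a multi-page manual into one document.
# Assumes the TOC page links to chapter pages in reading order.
from html.parser import HTMLParser
from urllib.parse import urljoin


class LinkCollector(HTMLParser):
    """Collect href targets in the order they appear on the page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            # Skip in-page anchors like "#section2"
            if href and not href.startswith("#"):
                self.links.append(href)


def stitch(toc_url, fetch):
    """Concatenate the pages linked from the TOC, in order.

    `fetch` is any callable mapping a URL to its HTML text, e.g.
    lambda u: urllib.request.urlopen(u).read().decode("utf-8", "replace")
    for real use; injecting it also makes this easy to test offline.
    """
    parser = LinkCollector()
    parser.feed(fetch(toc_url))
    seen, parts = set(), []
    for href in parser.links:
        url = urljoin(toc_url, href)  # resolve relative links
        if url not in seen:           # skip duplicate links to the same page
            seen.add(url)
            parts.append(fetch(url))
    return "\n<hr>\n".join(parts)
```

This doesn't recurse into sub-pages, strip navigation headers, or handle anything fancy, but it's the shape of the thing I'm imagining.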
Do any of you have thoughts or ideas about this? Has anyone already done this? Is there even existing software/code available? Am I trying to pee up a proverbial rope?