Hi Hank,

Sounds exciting! TL;DR: try it, and let a benchmark decide.
My gut feeling is that overall performance and cost (storage, CPU time
etc.) are likely to trump any purely CouchDB-related concerns. So I'd
seriously benchmark this; nothing makes for better advice than a real
test scenario run on real-world hardware (a trivial starting point is
sketch 1 at the bottom of this mail). Some considerations:

Every update you make creates another version of that ddoc. That will
trigger a view group rebuild over the whole database for any code change
(show/list etc.). This is probably one of the most common reasons for
splitting up ddocs (and databases) -- sketch 2 below illustrates the
coupling.

A static site with nginx or <preferred_server> will always beat a
dynamic CouchDB setup. Using nginx to cache content that hasn't changed
is a good compromise. There are Apache and nginx examples on the wiki
that leverage CouchDB's ETag support (sketch 3 shows the conditional-GET
mechanism they rely on) -- but is that fast enough for the hardware &
infrastructure you have budgeted?

If you're simply talking about having a lot of attachments in that ddoc
(couchapp style), is this for end-user convenience or developer
convenience? Updating a single .css file and then needing to re-push an
entire 100 MB+ ddoc doesn't seem efficient to me. I'd ask whether you
can get the same developer convenience from a pre-processing tool
(grunt, whatever -- sketch 4 is one shape this can take), and then use
the benchmarking results to decide whether the end-user experience is
going to be appropriate & cost-effective. If your ddoc + attachments
run to gigabytes I'd be worried, but 100 MB or so should not be a
constraint.

http://docs.couchdb.org/en/latest/config/http.html?highlight=secure_rewrites#httpd/secure_rewrites
may give you the flexibility to share web components across websites &
ddocs without duplicating all the code: you can put all the core files
into a single ddoc hosted in a common database and access them from a
single origin (sketch 5 below).

Finally, there are a lot of other factors that may influence your
choices: security, backups & version control being the big ones.

On 20 January 2014 13:46, Hank Knight <hknight...@gmail.com> wrote:
> Thanks, but I am looking for a direct answer. Could extremely large
> design documents cause performance problems?
>
> On Sun, Jan 19, 2014 at 7:20 PM, Andy Wenk <a...@nms.de> wrote:
>> On 19 January 2014 23:11, Hank Knight <hknight...@gmail.com> wrote:
>>
>>> I would like to use CouchDB to power a large website containing
>>> hundreds of pages. The pages will be dynamically created from
>>> thousands of documents. Some pages will be powered by List Functions
>>> and the other pages will be created by Show functions.
>>>
>>> There are many advantages of using a single design document to power
>>> ALL of the pages. For example, I could access common JavaScript
>>> modules / functions and I could manage common areas such as the
>>> header and footer from a single place.
>>>
>>> Are there any drawbacks to storing all pages inside a single design
>>> document?
>>
>> not answering your question directly, but this sounds like you want to
>> check out these frameworks for doing the job:
>>
>> http://couchapp.org/page/index
>> https://github.com/benoitc/erica
>> http://kan.so/
>>
>> Cheers
>>
>> Andy
>>
>> --
>> Andy Wenk
>> Hamburg - Germany
>> RockIt!
>>
>> http://www.couchdb-buch.de
>> http://www.pg-praxisbuch.de
>>
>> GPG fingerprint: C044 8322 9E12 1483 4FEC 9452 B65D 6BE3 9ED3 9588
>>
>> https://people.apache.org/keys/committer/andywenk.asc
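PS: a few rough sketches to make the above concrete. Every db name, URL
and file path below is made up -- treat these as starting points, not
drop-in code.

Sketch 1 -- a dead-simple latency probe to kick off the benchmarking
(Node-style JavaScript, assuming a runtime with built-in fetch; save as
bench.mjs and point it first at a show/list URL, then at the same page
served statically by nginx):

    // bench.mjs -- average latency over N requests to one URL
    const url = process.argv[2]; // e.g. http://127.0.0.1:5984/mydb/_design/site/_list/index/pages_by_slug
    const N = 200;

    let total = 0;
    for (let i = 0; i < N; i++) {
      const t0 = Date.now();
      const res = await fetch(url);
      await res.arrayBuffer(); // drain the body so we time the full response
      total += Date.now() - t0;
    }
    console.log(`${url}: avg ${(total / N).toFixed(1)} ms over ${N} requests`);

Run it against both setups on the hardware you actually intend to
deploy on, not your laptop.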
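Sketch 2 -- why one big ddoc couples everything. A design doc carries
views, shows and lists side by side (all names here are invented):

    // the one-ddoc-for-the-whole-site layout, as a JS object
    var ddoc = {
      _id: "_design/site",
      views: {
        pages_by_slug: {
          map: "function (doc) { if (doc.type === 'page') emit(doc.slug, null); }"
        }
      },
      shows: {
        page: "function (doc, req) { return { body: doc.html }; }"
      },
      lists: {
        index: "function (head, req) { var row; while ((row = getRow())) { send(row.key); } }"
      }
    };

Every edit to any one of those strings saves a new revision of the whole
ddoc, and as described above that can invalidate the view group and force
a rebuild across the entire database. Splitting rarely-changing views
away from frequently-tweaked shows/lists avoids exactly that.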
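Sketch 3 -- the mechanism the wiki's nginx/apache examples lean on is
plain HTTP conditional GET against CouchDB's ETags. You can verify it by
hand (Node-style JavaScript again, hypothetical URL):

    // CouchDB sends an ETag with the response; send it back and you
    // should see 304 Not Modified while the underlying data is unchanged
    const url = "http://127.0.0.1:5984/mydb/_design/site/_show/page/home";

    const first = await fetch(url);
    const etag = first.headers.get("etag");

    const second = await fetch(url, { headers: { "If-None-Match": etag } });
    console.log(second.status); // expect 304 if nothing changed

A caching proxy in front of CouchDB does this on your behalf, so
unchanged pages come from the cache instead of re-running the show
function every time.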
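Sketch 4 -- the pre-processing idea: keep the "everything in one place"
feel for developers, but build the attachments before pushing. A
Gruntfile along these lines (the plugins are real grunt-contrib ones,
the paths are made up, and the final push is whatever tool you already
use -- couchapp, erica etc.):

    // Gruntfile.js -- concat + minify into the couchapp's _attachments dir
    module.exports = function (grunt) {
      grunt.initConfig({
        concat: {
          js: { src: ["src/js/**/*.js"], dest: "app/_attachments/site.js" }
        },
        cssmin: {
          css: { files: { "app/_attachments/site.css": ["src/css/**/*.css"] } }
        }
      });

      grunt.loadNpmTasks("grunt-contrib-concat");
      grunt.loadNpmTasks("grunt-contrib-cssmin");

      // after `grunt`, push the built app, e.g.:
      //   couchapp push app http://127.0.0.1:5984/mydb
      grunt.registerTask("default", ["concat", "cssmin"]);
    };

That way a one-line .css change only costs you a rebuild + push of the
built artefacts, not hand-editing inside a 100 MB ddoc.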
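Sketch 5 -- sharing core files across sites via the rewriter. With
secure_rewrites disabled, a site's ddoc can route requests out to a
common ddoc in another database, so shared assets live in exactly one
place (db/ddoc names invented):

    // per-site ddoc: its rewrites escape to a shared "common" db
    var siteDdoc = {
      _id: "_design/site",
      rewrites: [
        // /shared/* is served from one central assets ddoc
        { from: "/shared/*", to: "../../../common/_design/assets/*" },
        // everything else stays in this ddoc
        { from: "/*", to: "_show/page/*" }
      ]
    };

The "../../../" hop is what requires [httpd] secure_rewrites = false,
and loosening it widens what URLs can reach -- which is exactly the kind
of security trade-off mentioned above, so weigh it deliberately.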