pouchdb-community/relational-pouch

Handling ECONNRESET


Hi all.

I'm trying to use a basic find. This works with no trouble:

db.rel.find("text").then(res => console.log(`length: ${res.texts.length}`));

After a minute or so, I get "length: 18" posted to the console. However, I have some slightly larger models:

db.rel.find("entry", {limit: 30000})
  .then(res => console.log(`length: ${res.entries.length}`));

(there are, as verified via db.allDocs, about 13k entry documents)

This always fails with an ECONNRESET, and I don't know what to do:

FetchError: request to http://localhost:5984/db/_find failed, reason: connect ECONNRESET 127.0.0.1:5984
    at ClientRequest.<anonymous> (/.../node_modules/node-fetch/lib/index.js:1444:11)

If I try it with a local PouchDB instead of the server, I get a JavaScript heap out of memory error, even though the db is only ~17 MB.

For comparison, I don't get ECONNRESET errors (or any errors at all) when I run the equivalent query with plain db.find:

db.find({
  selector: {
    _id: {
      $regex: `^entry_`
    }
  },
  limit: 30000
}).then(res => console.log(`length: ${res.docs.length}`));

Thanks. I incorporated that, but I'm still plagued by ECONNRESET errors. This is all part of a script that populates a database. Originally I was using a combination of db.rel.makeDocID and db.bulkDocs to create the documents, and then I wired their relations together with db.allDocs and db.bulkDocs. That all worked well, but, as mentioned above, db.rel.find routinely failed on the larger models.
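For reference, that original seeding looked roughly like this (a simplified sketch, not my exact script; the field names, the entry/text relation, and the lookupTextIdFor helper are just placeholders, and it all ran inside an async function):

// First pass: build the documents with relational-pouch style IDs.
const entryDocs = entries.map(entry => ({
  _id: db.rel.makeDocID({type: 'entry', id: entry.id}),
  data: {title: entry.title}  // relational-pouch keeps the attributes under `data`
}));
await db.bulkDocs(entryDocs);

// Second pass: read the entries back and wire up their relations.
const result = await db.allDocs({
  include_docs: true,
  startkey: 'entry_',
  endkey: 'entry_\ufff0'
});
await db.bulkDocs(result.rows.map(row => {
  row.doc.data.text = lookupTextIdFor(row.doc);  // placeholder helper
  return row.doc;
}));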

I thought perhaps there was something wrong with how I was creating the documents, so I changed the seeding: now it runs a Promise.all over an array of db.rel.save calls for each of the five models. But even that fails on the biggest model (13k docs).
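Concretely, the new version does something like this for each of the five arrays (again a sketch; entries is one of them):

// Kick off every save at once and wait for the whole batch.
await Promise.all(entries.map(entry => db.rel.save('entry', entry)));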

My goal is to reproducibly seed the database and then check it for consistency. If there are better ways than the two I've been trying, I'd like to know.

Thanks again.

Do you have test data we can replicate to a test db on my.couchcluster.com for me to try? My test works:

var PouchDB = require('pouchdb');
PouchDB.plugin(require('relational-pouch'));
PouchDB.plugin(require('pouchdb-find'));

var db = new PouchDB('https://my.couchcluster.com/bloggr');
db.setSchema([
  {singular: 'author', plural: 'authors', relations: {posts: {hasMany: {type: 'post', options: {queryInverse: 'author'}}}}},
  {singular: 'post', plural: 'posts', relations: {author: {belongsTo: 'author'}}}
]);
db.rel.find("post", {limit: 30000}).then(res => {
  console.log(`length: ${res.posts.length}`);
});

I think I solved it. In lieu of db.bulkDocs or Promise.all([db.rel.save, db.rel.save...]), I'm doing an await inside a loop:

for(const post of posts){
  await db.rel.save("post", post);
}

That avoids slamming the server with thousands of promises all at once. And now db.rel.find() works correctly too, even when I'm doing five finds at once inside a Promise.all().
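In case it helps anyone else, the whole seeding step now looks roughly like this (modelsByType is just an illustrative name for my five arrays keyed by type):

async function seed(db, modelsByType) {
  for (const [type, records] of Object.entries(modelsByType)) {
    // Save one document at a time so the server never sees
    // thousands of concurrent requests.
    for (const record of records) {
      await db.rel.save(type, record);
    }
  }
}

It's slower than the Promise.all version, but it finishes reliably.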

So I think we're good here. Thanks again for your help; I'm slowly getting a better handle on promises.