I write a lot of C#, which has a really useful way to query large collections of data called LINQ.
Although JavaScript arrays have querying capabilities through `map`, `reduce`, and `filter`, the limitation is that each method processes the entire array before moving on to the next step: if you do a `map` and then a `filter`, the whole collection goes through the `map` step before any filtering starts. With large data sets this can take a while.
The advantage of LINQ is that it uses lazy execution: each item goes through the whole pipeline before the next item is processed, so with a large data set you can process just a subset of the data and break early. To do this it leverages ES6 iterators to `yield` each value.
```js
const items = [1, 2, 3, 4, 5, 6].asEnumerable();

// where keeps the odd numbers (x % 2 is truthy), select then adds 1 to each
for (let item of items.where(x => x % 2).select(x => x + 1))
    console.log(item);
// output will be 2, 4, 6
```
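To make the laziness concrete, here is a minimal generator-based sketch of the technique (an illustration only, not the library's actual implementation; the `where` and `select` names simply mirror the API above):

```js
// A minimal sketch of lazy where/select using ES6 generators.
// Each item flows through the whole pipeline before the next one starts.
function* where(source, predicate) {
  for (const item of source) {
    if (predicate(item)) yield item;
  }
}

function* select(source, selector) {
  for (const item of source) {
    yield selector(item);
  }
}

// Values are pulled one at a time, so you can break early
// without ever touching the rest of the collection.
for (const item of select(where([1, 2, 3, 4, 5, 6], x => x % 2), x => x + 1)) {
  console.log(item); // 2, 4, 6
}
```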
The following LINQ methods are implemented:
- `aggregate`
- `all`
- `any`
- `asEnumerable`
- `average`
- `concat`
- `contains`
- `count`
- `first`, `firstOrDefault`
- `range`
- `repeat`
- `select` (aliased to `map`)
- `selectMany`
- `single`, `singleOrDefault`
- `toArray`
- `where` (aliased to `filter`)
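A few of these chained together might look like the following. This is a hedged sketch that assumes the same `asEnumerable()` entry point shown earlier and that the resulting enumerable can be iterated more than once, as in .NET:

```js
const evens = [1, 2, 3, 4, 5, 6]
  .asEnumerable()
  .where(x => x % 2 === 0)  // lazy: nothing is evaluated yet
  .select(x => x * 10);

console.log(evens.count());   // 3
console.log(evens.first());   // 20
console.log(evens.toArray()); // [20, 40, 60]
```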
The full list of what LINQ in .NET does can be found here.
MIT