Your RESTful APIs with zero coding in less than 30 seconds...
I spent so much time googling for an easy way to get a free, full-featured REST API without finding what I wanted, so I decided to create Restlastic: it takes the ideas of json-server, the modularity of sails.js, and the scalability and features of elasticsearch and kibana.
See the 5-minute video presentation.
GET /restlastic?price=free
{
features : {
create : "Your Restful APIs in less than 30 sec",
code : "0 line of code",
enjoy : "Pagination, search, filters, fulltext,...",
generate : "Instantly swagger, postman, mocha",
dashboard : "Realtime graph with kibana",
scalable : "To infinity and beyond",
customise : "based on node.js, sails.js, elasticsearch"
},
tagline: "You Know, for API"
}
Create a products API (/data/v1/products)
curl -iX POST \
-H 'Content-Type: application/json; charset=utf-8' \
-d '{ "name":"banana", "status":"available", "price":12 }' \
'http://api.restlastic.com/data/v1/products'
Now you get:
- GET, POST, PUT, PATCH, DELETE actions available
- Pagination, fulltext search, filter operators, field selection, subresources, hierarchical JSON...
- Automatically managed fields: id, creation_date, modification_date, etag (illustrated in the example response below)
- Swagger documentation template with data example in swagger editor
- Postman request template
- Access to kibana for dashboarding your data
- Modular and scalable on-premise solution with restlastic, sails.js, elasticsearch, kibana
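For illustration only, fetching the created product could return something like this (the id, date and etag values are hypothetical; they are generated by the API):
GET /data/v1/products/1
{
  "id": "1",
  "name": "banana",
  "status": "available",
  "price": 12,
  "creation_date": "2014-06-18T23:59:59Z",
  "modification_date": "2014-06-18T23:59:59Z",
  "etag": "123456"
}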
Also, when making requests, it's good to know that:
- Your request body JSON should be object-enclosed, just like the GET output (for example {"name": "Foobar"})
- Id values are not mutable. Any id value in the body that differs from the id in the URL will result in an error
- A POST, PUT or PATCH request should include a Content-Type: application/json header to use the JSON in the request body. Otherwise it will result in a 200 OK but without changes being made to the data (see the example below)
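As an example, a PATCH on the product created above (with a hypothetical id of 1) would look like this:
curl -iX PATCH \
  -H 'Content-Type: application/json; charset=utf-8' \
  -d '{ "price": 15 }' \
  'http://api.restlastic.com/data/v1/products/1'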
You can install Restlastic with docker-compose, with npm (if elasticsearch is already installed), or use the cloud demo beta.
The docker-compose setup installs restlastic, elasticsearch and kibana:
$> git clone https://github.com/restlastic/restlastic.git
$> cd restlastic
$> docker-compose up -d
Then use http://localhost (api), http://localhost:9200 (elasticsearch), http://localhost:5601 (kibana)
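To check that everything is up, you can hit each endpoint (a minimal sketch; the exact responses will vary):
$> curl -i http://localhost          # restlastic API
$> curl http://localhost:9200        # elasticsearch cluster info
$> curl -i http://localhost:5601     # kibana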
With npm, you must have elasticsearch already installed:
$> npm install restlastic
$> npm start
Then use http://localhost:1337
For the cloud demo beta, use api.restlastic.com, elastic.restlastic.com and kibana.restlastic.com.
/!\ All data are deleted each day
Based on the previous POST command, here are all the default routes. You can also add your own routes using sails.js routes.
GET sample/v1/products
GET sample/v1/products/search?q=
GET sample/v1/products/:id
POST sample/v1/products
POST sample/v1/products/:id
PATCH sample/v1/products/:id
PUT sample/v1/products/:id
DELETE sample/v1/products/:id
GET sample/v1/products/swagger.yaml
GET sample/v1/products/postman.json
Add fields= to retrieve only some fields (use . to access deep properties); a sample response is sketched after the examples below.
GET sample/v1/products/:id?fields=id,name,creation_date
GET sample/v1/products?fields=id,name
GET sample/v1/products?fields=address
GET sample/v1/products?fields=address.locality
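As a sketch of the shape to expect (values and the exact response envelope are illustrative, following the paging format shown further down), fields=id,name keeps only the selected properties in each item:
GET sample/v1/products?fields=id,name
{
  "data": [
    { "id": "1", "name": "banana" },
    { "id": "2", "name": "kiwi" }
  ]
}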
Filter on any field with field=value (use . to access deep properties)
GET sample/v1/products?title=elastifull&author=clodio
GET sample/v1/products?id=1,2
GET sample/v1/products?id=1&id=2
GET sample/v1/products?author.name=clodio
Add start_index and/or count (a paging object with total_results will be included in the response, with next and previous links)
GET sample/v1/products?start_index=20
GET sample/v1/products?start_index=20&count=3
{
"data":[...],
"paging": {
"total_results":30,
"prev": "http://api.restlastic.com/sample/v1/products?start_index=17&count=3",
"next": "http://api.restlastic.com/sample/v1/products?start_index=23&count=3",
}
}
Add _gte, _lte, _gt, _lt to get a range
GET sample/v1/products?price_gte=10&price_lte=20
GET sample/v1/products?creation_date_gte=now-1d/d
GET sample/v1/products?creation_date_gte=now-1d
GET sample/v1/products?creation_date_gte=2014-06-18T23:59:59Z
Add _ne to exclude a value
GET sample/v1/products?price_ne=20
Add _exists=true to find records where the field exists, _exists=false to find records where it does not
GET sample/v1/products?address.locality_exists=true
GET sample/v1/products?address.locality_exists=false
Add _like to filter with a like pattern
GET sample/v1/products?title_like=server*
Add _prefix to filter by prefix
GET sample/v1/products?name_prefix=cl
Add _regex to filter with a RegExp
GET sample/v1/products?name_regex=clo.?dio
Add _fuzzy to filter with fuzzy search
GET sample/v1/products?votes_fuzzy=2 --> votes : 1,2,3
GET sample/v1/products?name_fuzzy=cladio --> name : clodio
GET sample/v1/products?name_fuzzy=cldio --> name : clodio
Add /search?q= to do a full-text search on all fields
GET sample/v1/products/search?q=*c
Resources can be linked to other resources with a _. In this case the subRessource will have a products_id field with the value :id (see the example request after the route list).
GET sample/v1/products/:id/_subRessource
GET sample/v1/products/:id/_subRessource/search?q=
GET sample/v1/products/:id/_subRessource/:subRessource_id
POST sample/v1/products/:id/_subRessource
POST sample/v1/products/:id/_subRessource/:subRessource_id
PATCH sample/v1/products/:id/_subRessource/:subRessource_id
PUT sample/v1/products/:id/_subRessource/:subRessource_id
DELETE sample/v1/products/:id/_subRessource/:subRessource_id
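For example, creating a review under product 1 (reviews is a hypothetical subresource name and the id is illustrative) stores the review with a products_id field set to 1:
curl -iX POST \
  -H 'Content-Type: application/json; charset=utf-8' \
  -d '{ "comment": "Very good bananas", "rating": 5 }' \
  'http://api.restlastic.com/sample/v1/products/1/_reviews'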
creation_date and modification_date are automatically managed and formatted per RFC 3339
{
"creation_date": "2014-06-18T23:59:59Z",
"modification_date": "2014-06-18T23:59:59Z",
}
You can use now, now-1d, now+1d/d, ... when searching/filtering (see the elastic.co date math documentation)
GET sample/v1/products?creation_date_gte=now-1d
All resources have a version; you can use it to cache data or to modify a specific version.
GET sample/v1/products/1
Headers:
etag: 123456
Body:
{
"id":"1",
"name":"banana",
"etag":"123456"
}
--> sends an etag header and an etag inside the result.
You can use the If-None-Match header to retrieve data only if it has been modified, or to modify data only if it has not been modified.
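A minimal sketch of a conditional GET using the etag from the previous response (returning 304 Not Modified when the data has not changed is the usual HTTP behaviour, assumed here):
curl -i \
  -H 'If-None-Match: 123456' \
  'http://api.restlastic.com/sample/v1/products/1'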
You can easily create a new resource or a new field by calling POST (e.g. /sample/v1/products). It will create an index in elasticsearch named sample_products. If you want to manage the types of your fields precisely, you must use [elasticsearch mapping](https://www.elastic.co/guide/en/elasticsearch/reference/current/mapping.html). By default, all fields will be considered as string, integer or date depending on the first POST. If you want to change a type or delete a field, you must delete the index before changing the mapping (see the delete example after the mapping commands below).
curl -XGET http://localhost:9200/sample_products/_mapping/products
curl -XPUT 'http://localhost:9200/sample_products/'
curl -XPUT 'http://localhost:9200/sample_products/_mapping/products' \
-d'{
"products": {
"properties":
{
"creation_date":{
"type":"date","format":"strict_date_optional_time||epoch_millis"
},
"etag":{
"type":"long"
},
"id":{
"type":"string"
},
"modification_date":{
"type":"date","format":"strict_date_optional_time||epoch_millis"
},
"name":{
"type":"string"
},
"price":{
"type":"long"
},
"status":{
"type":"string"
}
}
}
}'
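If you need to change a type or remove a field, delete the index first (note that this removes all documents stored in it), then recreate it with the new mapping as shown above:
curl -XDELETE 'http://localhost:9200/sample_products'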
Since data are stored in elasticsearch, you can build graphs with kibana and use specific elastic requests (aggregations, ...). To use your data you must configure an index pattern corresponding to it (/sample/v1/products will have a sample_products index).
See kibana for more information.
You can get a swagger template of your RESTful API.
Open directly in swagger editor http://editor.swagger.io/#/?import=http://api.restlastic.com/sample/v1/products/swagger.yaml&no-proxy
or download the file
GET http://api.restlastic.com/sample/v1/products/swagger.yaml
You can get a postman template of the RESTful API.
GET http://api.restlastic.com/sample/v1/products/postman.json
You can easily create data programmatically.
//import-data.js
var request = require('request');
var dns = "http://localhost:1337/sample/v1/users/";
var body = {};

// Create 1000 users
for (var i = 0; i < 1000; i++) {
  body = { id: i, name: 'user' + i };
  request.put({
    url: dns + i,
    body: body,
    rejectUnauthorized: false,
    json: true
  }, function (err, res, body) {
    if (err) {
      console.log(err);
    }
  });
}

// Then launch $> node import-data.js
Tip: use modules like faker, casual or chance to create random semantic data.
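A minimal sketch of the same loop using faker (the faker calls below follow its classic API and are assumptions, not part of Restlastic):
//import-fake-data.js
var request = require('request');
var faker = require('faker'); // assumed: classic faker package
var dns = "http://localhost:1337/sample/v1/users/";

// Create 1000 users with random but realistic-looking fields
for (var i = 0; i < 1000; i++) {
  request.put({
    url: dns + i,
    body: {
      id: i,
      name: faker.name.findName(),   // e.g. "Jane Doe"
      email: faker.internet.email(), // e.g. "jane.doe@example.com"
      city: faker.address.city()
    },
    rejectUnauthorized: false,
    json: true
  }, function (err) {
    if (err) {
      console.log(err);
    }
  });
}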
You can add your own routes using sails.js routes in the /config/routes.js file.
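A hypothetical /config/routes.js entry following the sails.js routes syntax (the route, controller and action names here are placeholders, not Restlastic internals):
// config/routes.js
module.exports.routes = {
  // map a custom URL onto a hypothetical controller/action
  'GET /sample/v1/products/featured': { controller: 'CustomController', action: 'featured' }
};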
MIT - Restlastic
- json-server
- Homepage based on Bootstrap and Bootswatch
- Icons from Font Awesome
- Web fonts from Google