[error] app.ddlog[56.48] failure: `json' expected but `t' found
eromoe opened this issue · 7 comments
Hi, the example at http://deepdive.stanford.edu/example-spouse seems to be out of date.
I tried the steps below:
root@2f46265d3bfb:/data/deepdive/examples/spouse# input/articles.tsj.sh
ERROR: Missing /data/deepdive/examples/spouse/input/signalmedia/signalmedia-1m.jsonl
# Please Download it from http://research.signalmedia.co/newsir16/signal-dataset.html
# Alternatively, use our sampled data by running:
deepdive load articles input/articles-100.tsv.bz2
# Or, skipping all NLP markup processes by running:
deepdive create table sentences
deepdive load sentences
deepdive mark done sentences
root@2f46265d3bfb:/data/deepdive/examples/spouse# deepdive load articles input/articles-100.tsv.bz2
/data/deepdive/examples/spouse: Not compiled yet, please run first: deepdive compile
Loading articles from input/articles-100.tsv.bz2 (tsv format)
psql: FATAL: database "deepdive_spous" does not exist
loading articles: 0:00:00 12 [ 129 /s] ([ 129 /s])
loading articles: 0:00:00 132KiB [1.38MiB/s] ([1.38MiB/s])
root@2f46265d3bfb:/data/deepdive/examples/spouse# deepdive compile
2017-07-20 01:20:22.371711 ‘run/LATEST.COMPILE’ -> ‘20170720/012022.323496981’
2017-07-20 01:20:22.371829 ‘run/RUNNING.COMPILE’ -> ‘20170720/012022.323496981’
2017-07-20 01:20:22.371854 Parsing DeepDive application (/data/deepdive/examples/spouse) to generate:
2017-07-20 01:20:22.371867 run/compiled/schema.json
2017-07-20 01:20:22.371879 from app.ddlog
2017-07-20 01:20:22.812800 [error] app.ddlog[56.48] failure: `json' expected but `t' found
2017-07-20 01:20:22.812857
2017-07-20 01:20:22.812877 implementation "udf/nlp_markup.sh" handles tsj lines.
2017-07-20 01:20:22.812893
2017-07-20 01:20:22.812910 ^
‘run/ABORTED.COMPILE’ -> ‘20170720/012022.323496981’
but I got this error:
[error] app.ddlog[56.48] failure: `json' expected but `t' found
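Judging from the caret position, the parser rejects the word tsj in the handles tsj lines declaration, so this may simply be an older DeepDive build without TSJ support (just a guess). A possible workaround, assuming the UDFs can also read and write TSV (upgrading DeepDive would be the cleaner fix):
sed -i 's/handles tsj lines/handles tsv lines/g' app.ddlog
deepdive compile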
I also tried the notebook included with that example; the !deepdive compile section gave this error:
2017-07-20 01:35:00.843402 ‘run/LATEST.COMPILE’ -> ‘20170720/013500.796026536’
2017-07-20 01:35:00.843518 ‘run/RUNNING.COMPILE’ -> ‘20170720/013500.796026536’
2017-07-20 01:35:00.843540 Parsing DeepDive application (/data/deepdive/examples/spouse) to generate:
2017-07-20 01:35:00.843553 run/compiled/schema.json
2017-07-20 01:35:00.843565 from app.ddlog
2017-07-20 01:35:01.248310 run/compiled/deepdive.conf
2017-07-20 01:35:01.248438 from app.ddlog
2017-07-20 01:35:01.700686 from deepdive.conf
2017-07-20 01:35:01.704035 run/compiled/deepdive.conf.json
2017-07-20 01:35:02.030732 Performing sanity checks on run/compiled/deepdive.conf.json:
2017-07-20 01:35:02.081543 checking if input_extractors_well_defined
2017-07-20 01:35:02.081608 checking if input_schema_wellformed
2017-07-20 01:35:02.082005 Normalizing and adding built-in processes to the data flow to compile:
2017-07-20 01:35:02.082525 run/compiled/config-0.00-init_objects.json
2017-07-20 01:35:02.099576 run/compiled/config-0.01-parse_calibration.json
2017-07-20 01:35:02.111656 run/compiled/config-0.01-parse_schema.json
2017-07-20 01:35:02.140960 run/compiled/config-0.51-add_init_app.json
2017-07-20 01:35:02.150884 run/compiled/config-0.52-input_loader.json
2017-07-20 01:35:02.170715 run/compiled/config-1.00-qualified_names.json
2017-07-20 01:35:02.194299 run/compiled/config-1.01-parse_inference_rules.json
2017-07-20 01:35:02.240167 run/compiled/config-2.01-grounding.json
2017-07-20 01:35:02.388125 run/compiled/config-2.02-learning_inference.json
2017-07-20 01:35:02.408455 run/compiled/config-2.03-calibration_plots.json
2017-07-20 01:35:02.421271 run/compiled/config-9.98-ensure_init_app.json
2017-07-20 01:35:02.432246 run/compiled/config-9.99-dependencies.json
2017-07-20 01:35:02.451437 run/compiled/config.json
2017-07-20 01:35:02.452824 Validating run/compiled/config.json:
2017-07-20 01:35:02.525518 checking if compiled_base_relations_have_input_data
2017-07-20 01:35:02.525573 [ERROR] base relation 'has_spouse' must have data to load at: input/has_spouse.*
2017-07-20 01:35:02.525592 [ERROR] FAILED deepdive check compiled_base_relations_have_input_data
2017-07-20 01:35:02.525608 checking if compiled_dependencies_correct
2017-07-20 01:35:02.525622 checking if compiled_input_output_well_defined
2017-07-20 01:35:02.525636 checking if compiled_output_uniquely_defined
‘run/ABORTED.COMPILE’ -> ‘20170720/013500.796026536’
And DeepDive would not automatically create the database:
root@2f46265d3bfb:/data/deepdive/examples/spouse# deepdive load articles input/articles-1000.tsv.bz2
/data/deepdive/examples/spouse: Not compiled yet, please run first: deepdive compile
Loading articles from input/articles-1000.tsv.bz2 (tsv format)
psql: FATAL: database "deepdive_spous" does not exist
loading articles: 0:00:00 0 [ 0 /s] ([ 0 /s])
loading articles: 0:00:00 192KiB [2.12MiB/s] ([2.12MiB/s])
postgres=# \l
List of databases
Name | Owner | Encoding | Collate | Ctype | Access privileges
-----------+----------+-----------+---------+-------+-----------------------
postgres | postgres | SQL_ASCII | C | C |
template0 | postgres | SQL_ASCII | C | C | =c/postgres +
| | | | | postgres=CTc/postgres
template1 | postgres | SQL_ASCII | C | C | =c/postgres +
| | | | | postgres=CTc/postgres
(3 rows)
After I manually created the database, I got:
Loading articles from input/articles-100.tsv.bz2 (tsv format)
ERROR: relation "articles" does not exist
loading articles: 0:00:00 12 [ 125 /s] ([ 125 /s])
loading articles: 0:00:00 132KiB [1.34MiB/s] ([1.34MiB/s])
Is there any guide that actually works?
In the first log:
/data/deepdive/examples/spouse: Not compiled yet, please run first: deepdive compile
I think you missed the compilation step. Before loading the data, try compiling the DeepDive config file:
deepdive compile
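Once the compile succeeds, the schema is generated and the tables can be created before loading. A minimal order (just a sketch, assuming the default setup) would be:
deepdive compile
deepdive create table articles
deepdive load articles input/articles-100.tsv.bz2
which should also avoid the ERROR: relation "articles" does not exist message seen above.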
@mcavdar He said: "I also tried the notebook included with that example; the !deepdive compile section gave this error."
@mcavdar
I still can't understand why it doesn't work.
➜ spouse git:(master) ✗ ll udf/nlp_markup.sh
-rwxrwxrwx 1 Echo staff 1.1K 11 9 16:59 udf/nlp_markup.sh
➜ spouse git:(master) ✗ ll input
total 31112
-rw-r--r-- 1 Echo staff 136K 11 9 16:59 articles-100.tsj.bz2
-rw-r--r-- 1 Echo staff 136K 11 9 16:59 articles-100.tsv.bz2
-rw-r--r-- 1 Echo staff 1.2M 11 9 16:59 articles-1000.tsj.bz2
-rw-r--r-- 1 Echo staff 1.2M 11 9 16:59 articles-1000.tsv.bz2
-rwxr-xr-x 1 Echo staff 711B 11 9 16:59 articles.tsj.sh
-rwxrwxrwx 1 Echo staff 209K 11 10 14:45 has_spouse
-rw-r--r-- 1 Echo staff 653K 11 9 16:59 sentences-100.tsj.bz2
-rw-r--r-- 1 Echo staff 642K 11 9 16:59 sentences-100.tsv.bz2
-rw-r--r-- 1 Echo staff 5.6M 11 9 16:59 sentences-1000.tsj.bz2
-rw-r--r-- 1 Echo staff 5.4M 11 9 16:59 sentences-1000.tsv.bz2
-rwxrwxrwx 1 Echo staff 76K 11 9 16:59 spouses_dbpedia.csv.bz2
The error is the same as before:
2017-07-20 01:35:02.525573 [ERROR] base relation 'has_spouse' must have data to load at: input/has_spouse.*
2017-07-20 01:35:02.525592 [ERROR] FAILED deepdive check compiled_base_relations_have_input_data
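One thing that stands out in the listing above: the check globs for input/has_spouse.*, and the file is named has_spouse with no extension, so it may simply not be matched. A possible fix, assuming that file really is tab-separated label data:
mv input/has_spouse input/has_spouse.tsv
deepdive compile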
@nlpjoe Did you modify the app.ddlog file? Which version of DeepDive are you using? (And which version of the spouse example?) By the way, I just noticed a strange line in the first output:
psql: FATAL: database "deepdive_spous" does not exist
It should be deepdive_spouse_$USER.
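The database name comes from the db.url file at the app root, so a minimal setup (a sketch, assuming a local Postgres) would be:
echo "postgresql://localhost:5432/deepdive_spouse_$USER" >db.url
createdb deepdive_spouse_$USER
deepdive compile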
Can you please let us know which steps you executed, from scratch?
deepdive compile is not working; it gives some JSON error. How do I debug it?
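For what it's worth, the coordinates in app.ddlog[56.48] should mean line 56, column 48, so a quick way to see the offending declaration (just a sketch) is:
sed -n '50,60p' app.ddlog
and then compare it against the app.ddlog shipped with the DeepDive version you have installed.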