NoSuchMethodError: The getter 'start' was called on null.
richtrefz opened this issue · 2 comments
richtrefz commented
Kubuntu 21.04
linkcheck --version
linkcheck version 2.0.19
linkcheck --no-show-redirects --debug https://www.topdumpsterrental.com/
Reading URLs:
https://www.topdumpsterrental.com/
Crawl will start on the following URLs: [https://www.topdumpsterrental.com/]
Crawl will check pages only on URLs satisfying: {https://www.topdumpsterrental.com/**}
Crawl will skip links that match patterns: UrlSkipper<>
Crawl will check the following servers (and their robots.txt) first: {www.topdumpsterrental.com}
Using 8 threads.
Checking robots.txt and availability of server: www.topdumpsterrental.com
Added: https://www.topdumpsterrental.com/ to Worker<1> with 0ms delay
Server check of www.topdumpsterrental.com complete.
Server check for www.topdumpsterrental.com complete: connected, robots.txt found.
Unhandled exception:
NoSuchMethodError: The getter 'start' was called on null.
Receiver: null
Tried calling: start
#0 _serializeSpan (package:linkcheck/src/origin.dart:37)
#1 Origin.toMap (package:linkcheck/src/origin.dart:29)
#2 Link.toMap (package:linkcheck/src/link.dart:60)
#3 FetchResults.toMap.<anonymous closure> (package:linkcheck/src/worker/fetch_results.dart:20)
#4 MappedListIterable.elementAt (dart:_internal/iterable.dart:412)
#5 ListIterator.moveNext (dart:_internal/iterable.dart:341)
#6 new _GrowableList._ofEfficientLengthIterable (dart:core-patch/growable_array.dart:188)
#7 new _GrowableList.of (dart:core-patch/growable_array.dart:150)
#8 new List.of (dart:core-patch/array_patch.dart:50)
#9 SetMixin.toList (dart:collection/set.dart:118)
#10 FetchResults.toMap (package:linkcheck/src/worker/fetch_results.dart:20)
#11 worker.<anonymous closure> (package:linkcheck/src/worker/worker.dart:193)
<asynchronous suspension>
Killing unresponsive Worker<1>
Done checking: https://www.topdumpsterrental.com/ (connection failed) => 0 links
- BROKEN
All jobs are done or user pressed Ctrl-C
Deduping destinations
Closing the isolate pool
Broken links
Done crawling.
Provided URLs failing:
https://www.topdumpsterrental.com/ (connection failed)
briandowd commented
- Ubuntu 20.04.2 LTS
- linkcheck version 2.0.19
- Dart SDK version: 2.14.2 (stable) (Unknown timestamp) on "linux_x64"
Debug results (sanitized):
linkcheck https://server.example.com -d
Reading URLs:
https://server.example.com
Crawl will start on the following URLs: [https://server.example.com]
Crawl will check pages only on URLs satisfying: {https://server.example.com/**}
Crawl will skip links that match patterns: UrlSkipper<>
Crawl will check the following servers (and their robots.txt) first: {server.example.com}
Using 8 threads.
Checking robots.txt and availability of server: server.example.com
Added: https://server.example.com to Worker<1> with 0ms delay
Server check of server.example.com complete.
Server check for server.example.com complete: connected, robots.txt found.
Unhandled exception:
NoSuchMethodError: The getter 'start' was called on null.
Receiver: null
Tried calling: start
#0 Object.noSuchMethod (dart:core-patch/object_patch.dart:63:5)
#1 _serializeSpan (package:linkcheck/src/origin.dart:37:46)
#2 Origin.toMap (package:linkcheck/src/origin.dart:29:17)
#3 Link.toMap (package:linkcheck/src/link.dart:60:26)
#4 FetchResults.toMap.<anonymous closure> (package:linkcheck/src/worker/fetch_results.dart:20:44)
#5 MappedListIterable.elementAt (dart:_internal/iterable.dart:413:31)
#6 ListIterator.moveNext (dart:_internal/iterable.dart:342:26)
#7 new _GrowableList._ofEfficientLengthIterable (dart:core-patch/growable_array.dart:188:27)
#8 new _GrowableList.of (dart:core-patch/growable_array.dart:150:28)
#9 new List.of (dart:core-patch/array_patch.dart:50:28)
#10 ListIterable.toList (dart:_internal/iterable.dart:213:44)
#11 FetchResults.toMap (package:linkcheck/src/worker/fetch_results.dart:20:54)
#12 worker.<anonymous closure> (package:linkcheck/src/worker/worker.dart:193:66)
<asynchronous suspension>
Killing unresponsive Worker<1>
Done checking: https://server.example.com (connection failed) => 0 links
- BROKEN
All jobs are done or user pressed Ctrl-C
Deduping destinations
Closing the isolate pool
Broken links
Done crawling.
Provided URLs failing:
https://server.example.com (connection failed)
Error. Couldn't connect or find any links.
nico-deforge commented
Hello,
I am facing the same issue. Did you ever find a solution?
Thank you in advance!
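Both stack traces point at the same spot: `_serializeSpan` (package:linkcheck/src/origin.dart:37) reads `.start` on a span that is null while `Origin.toMap` serializes a link's origin. linkcheck itself is Dart; the sketch below is a minimal Python analogue of that failure mode and of a null-safe guard. All names here (`Span`, `Origin`, `serialize_origin_*`) are hypothetical illustrations, not linkcheck's actual API.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Span:
    start: int
    end: int


@dataclass
class Origin:
    url: str
    # The span may be absent, e.g. when a link has no recoverable
    # source location -- the situation the stack traces suggest.
    span: Optional[Span]


def serialize_origin_buggy(origin: Origin) -> dict:
    # Mirrors the failing code path: unconditionally reads span.start,
    # which raises AttributeError here (NoSuchMethodError in Dart)
    # whenever span is None.
    return {
        "url": origin.url,
        "start": origin.span.start,
        "end": origin.span.end,
    }


def serialize_origin_fixed(origin: Origin) -> dict:
    # Null-safe variant: only serialize the span when it is present.
    data = {"url": origin.url}
    if origin.span is not None:
        data["start"] = origin.span.start
        data["end"] = origin.span.end
    return data


if __name__ == "__main__":
    origin = Origin("https://server.example.com", span=None)
    try:
        serialize_origin_buggy(origin)
    except AttributeError as exc:
        print("crash:", exc)
    print("fixed:", serialize_origin_fixed(origin))
```

In Dart, the equivalent guard would be a null check or a null-aware access (`span?.start`) before serializing; the point of the sketch is only that the serializer must tolerate a missing span instead of assuming it exists.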