IPFS hash link keeps loading without responding
0xisk opened this issue · 2 comments
Hi,
I'm trying to add many JSON files to IPFS using the following API:
client.add('item.json')
And here is the content of the item.json file:
{"name":"Musk","description":"description","image":"https://ipfs.io/ipfs/QmP8ZD7YrSyPWaU1UnTD9jFR6rwokncCCd3WK7f7eSJi11/0003.jpeg"}
The hash returned for this item.json resolves successfully, but when I add another JSON file that is only slightly different from the first one, the returned IPFS hash keeps loading for a long time without any output.
And here is the content of the second file, item2.json:
{"name":"Musk2","description":"description","image":"https://ipfs.io/ipfs/QmP8ZD7YrSyPWaU1UnTD9jFR6rwokncCCd3WK7f7eSJi11/0004.jpeg"}
This also happens whenever I change the name value to anything other than "Musk".
When I use the local gateway, http://localhost:8081/ipfs/<hash>, it works, but with the main ipfs.io gateway, https://ipfs.io/ipfs/<hash>, it does not.
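For context, here is a minimal sketch of what I'm doing; the file name and the local gateway port (8081) just reflect my setup, and the response keys are the ones ipfshttpclient returns for a single-file add:

import json
import ipfshttpclient

# Connect to the local daemon on the default API port (127.0.0.1:5001)
with ipfshttpclient.connect() as client:
    # Adding a single file returns a dict with 'Name', 'Hash' and 'Size'
    res = client.add('item.json')
    cid = res['Hash']
    print('local gateway :', 'http://localhost:8081/ipfs/' + cid)
    print('public gateway:', 'https://ipfs.io/ipfs/' + cid)

    # Reading the file back through the local API works right away
    print(json.loads(client.cat(cid)))

The first URL loads immediately; the second one is the one that keeps loading.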
I need your help to understand this behavior.
Thanks.
Do you have a script that can reproduce this?
This happens if your IPFS server is the only server with the requested content, but port 4001 cannot be reached from the public Internet. For example, if you run IPFS on your desktop or laptop and do not have port 4001 forwarded on your network from the outside all the way to your server, your server is unreachable to HTTP gateways (e.g. ipfs.io) and to other IPFS servers, and so is the content.
You can fix this by either setting up port forwarding, or by running IPFS on a server with a public IP address.
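As a quick sanity check, you can ask your daemon which addresses it actually announces, using the same Python client you are already using; this is only a sketch, and deciding which addresses are private is the one judgement it leaves to you:

import ipfshttpclient

# Ask the local daemon for its peer ID and the addresses it announces
with ipfshttpclient.connect() as client:
    info = client.id()
    print('Peer ID:', info['ID'])
    for addr in info['Addresses']:
        print(addr)

# If every announced address is loopback (127.0.0.1, ::1) or a private
# range such as 192.168.x.x, other IPFS servers and the ipfs.io gateway
# cannot dial your node, and content only your node holds will not load.

If only private addresses show up there, port forwarding (or moving to a host with a public IP) is the fix described above.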
Thanks so much for your help.
Yes, I do. Here is a simple Python script that reads 10,000 images and uploads them to IPFS, then builds a JSON metadata file for each image (10,000 JSON files) and uploads those as well, and finally writes all the resulting IPFS hashes to files.
import json
import os
import glob
import ipfshttpclient

# Hashes for all the items
ipfs_hashes = []
# A list of image hashes after uploading to IPFS
img_hashes = []
# Hashes for all the JSON items
obj_hashes = []
# A list of JSON hashes after uploading to IPFS
final_hashed = []

# Reading all images from /items directory
with ipfshttpclient.connect() as client:
    ipfs_hashes = client.add('items', pattern='*.jpg')

    for obj in ipfs_hashes:
        if obj['Name'] == "items":
            break
        obj_name = obj['Name']
        split_str_1 = obj_name.split('/')
        split_str_2 = split_str_1[1].split('.')

        image = dict({
            "name": split_str_2[0],
            "description": "description",
            "image": "https://ipfs.io/ipfs/" + obj['Hash']
        })
        img_hashes.append(dict({
            "name": split_str_2[0],
            "description": "description",
            "hash": obj['Hash']
        }))

        image_json_file = open("json_items/" + split_str_2[0] + ".json", "w+")
        image_json_file.write(json.dumps(image))
        image_json_file.close()

    obj_hashes = client.add('json_items', pattern='*.json')

    for obj in obj_hashes:
        if obj['Name'] == "json_items":
            break
        item_json_hash = obj['Hash']
        final_hashed.append(item_json_hash)

json_object_hashes = json.dumps(final_hashed, indent=4)
hashes_json_file = open("hashed.json", "w")
hashes_json_file.write(json_object_hashes)
hashes_json_file.close()

json_img_hashes = json.dumps(img_hashes, indent=4)
img_hashes_file = open("image_hashes.json", "w")
img_hashes_file.write(json_img_hashes)
img_hashes_file.close()
And here is the output of the IPFS daemon:
Initializing daemon...
go-ipfs version: 0.7.0-ea77213e3
Repo version: 10
System version: amd64/linux
Golang version: go1.15.2
Swarm listening on /ip4/127.0.0.1/udp/4001/quic
Swarm listening on /ip4/172.17.0.1/tcp/4001
Swarm listening on /ip4/172.17.0.1/udp/4001/quic
Swarm listening on /ip4/192.168.1.104/tcp/4001
Swarm listening on /ip4/192.168.1.104/udp/4001/quic
Swarm listening on /ip6/::1/tcp/4001
Swarm listening on /ip6/::1/udp/4001/quic
Swarm listening on /ip6/fd74:ae1:c698:db00:da5c:3a2c:f128:5c03/tcp/4001
Swarm listening on /ip6/fd74:ae1:c698:db00:da5c:3a2c:f128:5c03/udp/4001/quic
Swarm listening on /p2p-circuit
Swarm announcing /ip4/127.0.0.1/tcp/4001
Swarm announcing /ip4/127.0.0.1/udp/4001/quic
Swarm announcing /ip4/192.168.1.104/tcp/4001
Swarm announcing /ip4/192.168.1.104/udp/4001/quic
Swarm announcing /ip6/::1/tcp/4001
Swarm announcing /ip6/::1/udp/4001/quic
API server listening on /ip4/127.0.0.1/tcp/5001
WebUI: http://127.0.0.1:5001/webui
Gateway (readonly) server listening on /ip4/127.0.0.1/tcp/8081
Daemon is ready
Here is the output of running sudo lsof -i:4001 to check what's running on port 4001:
COMMAND PID USER FD TYPE DEVICE SIZE/OFF NODE NAME
ipfs 22662 isk 12u IPv6 63989 0t0 TCP *:newoak (LISTEN)
ipfs 22662 isk 13u IPv4 57325 0t0 TCP *:newoak (LISTEN)
ipfs 22662 isk 14u IPv4 57327 0t0 UDP *:newoak
ipfs 22662 isk 15u IPv6 59976 0t0 UDP *:newoak
ipfs 22662 isk 20u IPv4 61387 0t0 TCP archlinux:newoak->17-39-17-223-on-nets.com:60590 (ESTABLISHED)
ipfs 22662 isk 23u IPv4 61402 0t0 TCP archlinux:newoak->123-205-7-246.adsl.dynamic.seed.net.tw:47745 (SYN_SENT)
ipfs 22662 isk 24u IPv4 69759 0t0 TCP archlinux:newoak->178.44.185.50:40019 (ESTABLISHED)
ipfs 22662 isk 25u IPv4 68058 0t0 TCP archlinux:newoak->r33-146-38-77-broadband.btv.lv:45775 (ESTABLISHED)
ipfs 22662 isk 26u IPv4 61370 0t0 TCP archlinux:newoak->220.95.152.129:60474 (ESTABLISHED)
ipfs 22662 isk 27u IPv4 69763 0t0 TCP archlinux:newoak->ec2-35-81-238-43.us-west-2.compute.amazonaws.com:newoak (ESTABLISHED)
ipfs 22662 isk 28u IPv4 69708 0t0 TCP archlinux:newoak->1-65-164-048.static.netvigator.com:59978 (ESTABLISHED)
ipfs 22662 isk 29u IPv4 63003 0t0 TCP archlinux:newoak->228-27-145-85.ftth.glasoperator.nl:newoak (ESTABLISHED)
ipfs 22662 isk 30u IPv4 64036 0t0 TCP archlinux:newoak->static.192.233.217.95.clients.your-server.de:newoak (ESTABLISHED)
ipfs 22662 isk 31u IPv4 60090 0t0 TCP archlinux:newoak->175.212.105.13:33954 (ESTABLISHED)
So do you still think that this is the same issue?