Airtable connector fails
kikoncuo commented
When reading from a successfully connected Airtable connector, the connector looks for a file in the local cache; no such file exists, so the sync fails.
I tried to simplify the example as much as possible:
# Import PyAirbyte
import airbyte as ab

# Create and install airtable connector
source: ab.Source = ab.get_source("source-airtable")

# Configure the source
source.set_config(
    config={
        "credentials": {
            "auth_method": "api_key",
            "api_key": "MyKey",
        },
    },
)

# Verify the config and creds by running `check`:
source.check()

source.select_streams("*")
read_result: ab.ReadResult = source.read()

# Print the read result
print("Read result:")
print(read_result)
Error:
FileNotFoundError: [Errno 2] No such file or directory: '.cache/default_cache/talentclass/all_leads/tblnpRi7u0YECVout_01J01NPY9AKJ84BQKKK6S91BNR.jsonl.gz'
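This looks like a path construction issue rather than a credentials problem: the Airtable stream name contains slashes (the log below shows `Syncing stream: talentclass/all_leads/tblnpRi7u0YECVout`), so the cache file path gains nested subdirectories that are apparently never created. A minimal sketch of the failure, reusing the path and batch ID from the error above:

import gzip
from pathlib import Path

# The stream name emitted by source-airtable contains slashes:
stream_name = "talentclass/all_leads/tblnpRi7u0YECVout"

# Those slashes become subdirectories under the cache dir, so the batch file
# path points into directories that do not exist yet.
file_path = Path(".cache/default_cache") / f"{stream_name}_01J01NPY9AKJ84BQKKK6S91BNR.jsonl.gz"

try:
    # gzip.open() does not create parent directories, hence the crash.
    gzip.open(file_path, "w")
except FileNotFoundError as err:
    print(err)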
Full logs:
(myenv) enrique@Enrique-TUF:~/pyAirbyteTests$ python airtableTests.py
Connection check succeeded for `source-airtable`.
Started `source-airtable` read operation at 20:13:59...
Failed `source-airtable` read operation at 20:14:04.
Traceback (most recent call last):
  File "/home/enrique/pyAirbyteTests/myenv/lib/python3.10/site-packages/airbyte/sources/base.py", line 734, in read
    cache_processor.process_airbyte_messages(
  File "/home/enrique/pyAirbyteTests/myenv/lib/python3.10/site-packages/airbyte/_future_cdk/record_processor.py", line 196, in process_airbyte_messages
    self.process_record_message(
  File "/home/enrique/pyAirbyteTests/myenv/lib/python3.10/site-packages/airbyte/_future_cdk/sql_processor.py", line 237, in process_record_message
    self.file_writer.process_record_message(
  File "/home/enrique/pyAirbyteTests/myenv/lib/python3.10/site-packages/airbyte/_processors/file/base.py", line 155, in process_record_message
    batch_handle = self._new_batch(stream_name=stream_name)
  File "/home/enrique/pyAirbyteTests/myenv/lib/python3.10/site-packages/airbyte/_processors/file/base.py", line 106, in _new_batch
    batch_handle = BatchHandle(
  File "/home/enrique/pyAirbyteTests/myenv/lib/python3.10/site-packages/airbyte/_batch_handles.py", line 30, in __init__
    self._open_file_writer: IO[bytes] = file_opener(self._files[0])
  File "/home/enrique/pyAirbyteTests/myenv/lib/python3.10/site-packages/airbyte/_processors/file/jsonl.py", line 34, in _open_new_file
    return cast(IO[bytes], gzip.open(file_path, "w"))
  File "/usr/lib/python3.10/gzip.py", line 58, in open
    binary_file = GzipFile(filename, gz_mode, compresslevel)
  File "/usr/lib/python3.10/gzip.py", line 174, in __init__
    fileobj = self.myfileobj = builtins.open(filename, mode or 'rb')
FileNotFoundError: [Errno 2] No such file or directory: '.cache/default_cache/talentclass/all_leads/tblnpRi7u0YECVout_01J01NPY9AKJ84BQKKK6S91BNR.jsonl.gz'
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
  File "/home/enrique/pyAirbyteTests/airtableTests.py", line 23, in <module>
    read_result: ab.ReadResult = source.read()
  File "/home/enrique/pyAirbyteTests/myenv/lib/python3.10/site-packages/airbyte/sources/base.py", line 745, in read
    raise exc.AirbyteConnectorFailedError(
airbyte.exceptions.AirbyteConnectorFailedError: AirbyteConnectorFailedError: Connector failed.
Log output:
Starting syncing SourceAirtable
Marking stream talentclass/all_leads/tblnpRi7u0YECVout as STARTED
Syncing stream: talentclass/all_leads/tblnpRi7u0YECVout
Marking stream talentclass/all_leads/tblnpRi7u0YECVout as RUNNING
Exception ignored in: <function BatchHandle.__del__ at 0x7f3433d47e20>
Traceback (most recent call last):
  File "/home/enrique/pyAirbyteTests/myenv/lib/python3.10/site-packages/airbyte/_batch_handles.py", line 84, in __del__
    self.close_files()
  File "/home/enrique/pyAirbyteTests/myenv/lib/python3.10/site-packages/airbyte/_batch_handles.py", line 66, in close_files
    if self.open_file_writer is None:
  File "/home/enrique/pyAirbyteTests/myenv/lib/python3.10/site-packages/airbyte/_batch_handles.py", line 62, in open_file_writer
    return self._open_file_writer
AttributeError: 'BatchHandle' object has no attribute '_open_file_writer'
Read Progress
Started reading at 18:14:00.
Read 0 records over 3 seconds (0.0 records / second).
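Two notes on these logs. The trailing `Exception ignored in: BatchHandle.__del__` block is just fallout from the first error: `BatchHandle.__init__` raises before `_open_file_writer` is ever assigned, so the destructor of the half-constructed object trips over the missing attribute.

As an untested workaround (an assumption on my part, not a confirmed fix), pre-creating the nested directories that the slash-containing stream name maps to should let `gzip.open()` create the batch file. The directory below is taken from the error path and would differ for other bases/tables:

from pathlib import Path

# Hypothetical workaround: create the nested cache directories up front so
# gzip.open() can create the batch file inside them. Reuses `source` from
# the repro script above.
Path(".cache/default_cache/talentclass/all_leads").mkdir(parents=True, exist_ok=True)

read_result = source.read()
print(read_result)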
ronneldavis commented
I am facing the same issue with Airtable, but using OAuth2 credentials. It would be good to have some documentation around this.
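For anyone else hitting this, an OAuth2 config for `source-airtable` should look roughly like the sketch below. The field names under `credentials` are my reading of the connector spec, not verified documentation, so double-check them against the source-airtable reference before relying on this:

import airbyte as ab

source = ab.get_source("source-airtable")
source.set_config(
    config={
        "credentials": {
            # Assumed field names; verify against the source-airtable spec.
            "auth_method": "oauth2.0",
            "client_id": "MyClientId",
            "client_secret": "MyClientSecret",
            "refresh_token": "MyRefreshToken",
        },
    },
)
source.check()

Note that the read failure itself appears independent of the auth method: both the API-key and OAuth2 syncs hit the same cache path problem.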