dalibo/sqlserver2pgsql

Question: Running the before script

Closed this issue · 5 comments

I am having an issue running the before script on the command
psql -h localhost -p 5432 -U docker docker -f output_before_script.sql
with the error saying there is no file
psql: error: output_before_script.sql: No such file or directory

Any suggestions on what could be the cause with this?

I have run the script to create the files but I am at an impasse.
./sqlserver2pgsql.pl -f sqlserver_sql_dump \
  -b output_before_script \
  -a output_after_script \
  -u output_unsure_script
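
A quick sanity check after running the tool is to list exactly what it produced; this is a generic shell check, nothing specific to sqlserver2pgsql:

ls -l output_before_script* output_after_script* output_unsure_script*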

Hi,
Do you get any message when running the sqlserver2pgsql.pl command?
Do you see other output files in the current directory (where you put your dump file and the tool)?
Do you see the output of a simple ./sqlserver2pgsql command (without options)?
I see a "docker" database and role in your psql command. I am not a Docker expert at all, but if you run the tool inside a container, could it be that the output files are not visible from outside?
Philippe.

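If the tool really did run inside a container, one way to check for the files from the host is shown below; the container name and path are placeholders, not taken from this thread:

docker exec my_container ls -l /path/to/output_before_script
docker cp my_container:/path/to/output_before_script .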

I see no specific messages when I run the command; it outputs the files as expected.
The "docker" database is my MSSQL database that I need to convert. Networking that container to one that can see the output files may be the proper solution. Will update with any findings!
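One possible sketch of that idea: bind-mount the directory holding the generated scripts into a containerized psql client, so the host files are visible inside the container. The image, password, and paths here are illustrative only:

docker run --rm --network host -v "$(pwd)":/scripts \
  -e PGPASSWORD=secret postgres \
  psql -h localhost -p 5432 -U docker docker -f /scripts/output_before_script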

The Perl script's output files have no extension by default, e.g. output_before_script rather than output_before_script.sql.

Have you tried the same command without the extension:
psql -h localhost -p 5432 -U docker docker -f output_before_script
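
Alternatively, the -b/-a/-u options appear to take the output filenames verbatim, so you could bake the extension in when generating the files. This is an untested sketch, consistent with the behavior above:

./sqlserver2pgsql.pl -f sqlserver_sql_dump -b output_before_script.sql \
  -a output_after_script.sql -u output_unsure_script.sql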

I'm actually going to close this, as I have found a workaround for my issue.

@shuby722 could you post your workaround?