FTP2SSH is a utility that exposes a plain bash file-system session as an FTP server when scp/sftp/smb are unavailable, for example when the only access is a shell spawned through kubernetes, or when scp/sftp has been suppressed for whatever reason.
- Download dist directory to a local folder.
- Create a "configuration.json" file:
{"spawn": "<command to spawn a bash console>", "exit": "<command to exit the bash console>", "host": "127.0.0.1"}
- Run
./start.sh
or java -jar ftp2Ssh-0.0.1-SNAPSHOT.jar
- Open Filezilla and connect to localhost:8888 with any login/password (they are passed to the spawn command as {{USER}} and {{PASS}}).
- Use it as regular FTP access.
Requirements for the system where the utility is started:
- Java 1.8+
Requirements for the system accessed over a bash-like console (ssh/kubectl/etc.):
- base64
- split
- ls
- cat
- echo
- stat
- mkdir
- rm
- mv
- gzip (optional)
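The requirements above can be checked up front. This is a hypothetical pre-flight check, not part of the utility itself; run it inside the shell that the spawn command produces:

```shell
# Verify the remote side provides every command the default configuration relies on.
missing=""
for c in base64 split ls cat echo stat mkdir rm mv; do
  command -v "$c" >/dev/null 2>&1 || missing="$missing $c"
done
if [ -n "$missing" ]; then
  echo "missing:$missing"
else
  echo "all required commands present"
fi
```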
During the STOR command the file is read chunk by chunk. Each chunk is base64 encoded and stored as a file in the temporary folder. After the last chunk is stored, a command merges all of the chunks into the destination file.
The RETR command encodes the remote file, splits it into chunks and stores each chunk in the temporary folder. The utility then reads each chunk and transfers it according to the requirements of the FTP protocol.
In gzip mode the utility compresses each chunk before base64 encoding it.
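The STOR flow can be sketched in plain shell. This is a minimal sketch assuming a writable /tmp and a single chunk; the real utility drives equivalent commands over the spawned console:

```shell
# Pretend this single file is one chunk received over FTP.
printf 'hello world' > /tmp/source.bin

# storePiece step: base64 encode the chunk and append it to a temp file.
piece=$(base64 /tmp/source.bin)
rm -f /tmp/chunk.000.b64 /tmp/dest.bin
echo "$piece" >> /tmp/chunk.000.b64

# joinStoredPieces step: decode the temp file and append to the destination.
base64 -d /tmp/chunk.000.b64 >> /tmp/dest.bin

cat /tmp/dest.bin    # the original data, reassembled
```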
Parameter | Type | Default / Example | Description |
---|---|---|---|
port | int | 8888 | Port to listen on |
host | String | 0.0.0.0 | Hostname to listen on |
spawn | String | sshpass -p {{PASS}} ssh {{USER}}@127.0.0.1 (default: bash) | Command to spawn a new console. Available parameters: {{USER}} - login of the FTP user, {{PASS}} - password of the FTP user |
chunkSize | int | 10000 | Amount of data in bytes to transfer at once. Should not be too big, as by default it is written via echo "..." >>... |
tmpFolder | String | /tmp | Folder to store chunks of the file being transferred. Has to be writable by the spawned bash session. |
timeoutSec | int | 30 | Timeout in seconds for a command to execute or for a chunk to transfer. |
maxLayers | int | 10 | Maximum number of spawned bash sessions per FTP session. Minimum is 2. |
gzip | boolean | false | Use gzip to compress data before sending it over the console. |
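For illustration, a configuration.json combining several of these parameters might look like this (the values are arbitrary examples, not recommendations):

```json
{
  "spawn": "bash",
  "exit": "exit",
  "host": "127.0.0.1",
  "port": 2121,
  "chunkSize": 20000,
  "tmpFolder": "/tmp/ftp2ssh",
  "timeoutSec": 60,
  "gzip": true
}
```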
Parameter | Type | Default / Example | Description |
---|---|---|---|
prepareFileToRetr | String | base64 -w {{CHUNK_SIZE}} {{FILE}} | split -l 1 - {{TMP}}{{DS}} | Command to split a remote file for retrieval. The command should create a set of files in the temporary folder, to be retrieved in alphabetical order. Each file has to be base64 encoded. {{CHUNK_SIZE}} - size of a chunk in bytes, {{FILE}} - file to retrieve, {{TMP}} - temporary folder to split the file into, {{DS}} - directory separator |
fileSlice | String | dd "if={{FROM}}" "of={{TO}}" bs={{SLICE_SIZE}} count={{SLICE_COUNT}} skip={{SLICE_OFFSET}} | Command to copy part of a remote file. {{FROM}} - name of the source file, {{TO}} - where to copy, {{SLICE_SIZE}} - size of a chunk, {{SLICE_COUNT}} - number of chunks to copy, {{SLICE_OFFSET}} - number of chunks to skip from the file start, {{SIZE}} - number of bytes to copy (SLICE_SIZE * SLICE_COUNT), {{OFFSET}} - number of bytes to skip from the start (SLICE_SIZE * SLICE_OFFSET) |
retrPiece | String | cat {{FILE}} | Command to print a chunk to stdout. {{FILE}} - the name of the file to print |
storePiece | String | echo {{PIECE}} >>{{TMPFILE}} | Command to store a chunk of data. {{PIECE}} - a base64 encoded chunk {{TMPFILE}} - name of the file to store |
joinStoredPieces | String | base64 -d {{TMPFILE}} >>{{FILE}} | Base64 decode a chunk and store it to the file. {{TMPFILE}} - name of the file storing the chunk {{FILE}} - name of the destination file |
echo | String | echo | Print something to stdout. Used as echo "Some escaped text" |
cd | String | cd | Change directory. Used as cd "/escaped/path" |
pwd | String | pwd | Print current directory. |
lsla | String | ls -la --time-style="+%Y-%m-%dT%H:%M" | Enumerate files in directory. Used as ls -la --time-style="+%Y-%m-%dT%H:%M" "/escaped/path" . |
lsw1 | String | ls -w1 | Print files in directory one per line. Used as ls -w1 "/escaped/path" . |
size | String | stat {{FILE}} | grep -oEh "Size: [0-9]+" | cut -b 7- | Print file size in bytes. {{FILE}} - filename to get size |
mkdir | String | mkdir | Create directory. Used as mkdir "/path/to/directory" . |
mv | String | mv {{FROM}} {{TO}} | Move file. {{FROM}} - filename to move. {{TO}} - destination path |
rmrf | String | rm -rf | Remove a file or a directory recursively. Used as rm -rf "/path/to/remove" |
exit | String | exit | Gracefully close the console. |
DS | String | / | Directory Separator. |
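Any command in the table above can be overridden in configuration.json. For example, on a system with BSD stat (where the default GNU-style size command would fail), a hypothetical override might look like:

```json
{
  "spawn": "bash",
  "exit": "exit",
  "size": "stat -f %z {{FILE}}"
}
```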
If the gzip configuration option is set to true, the utility assumes that chunks are gzipped before transfer: chunks stored in the temporary folder during STOR are gzipped and base64 encoded, and each chunk has to be gzipped and base64 encoded before RETR.
Default commands in gzip mode:
Command | Default |
---|---|
joinStoredPieces | base64 -d {{TMPFILE}} | gzip -d -c - >>{{FILE}} |
prepareFileToRetr | split -b {{CHUNK_SIZE}} --additional-suffix .split {{FILE}} {{TMP}}{{DS}} && gzip {{TMP}}{{DS}}* && for l in `ls {{TMP}}`; do base64 {{TMP}}{{DS}}$l >{{TMP}}{{DS}}$l.b64; done && rm {{TMP}}{{DS}}*.split.gz |
FTP over bash on the local computer, just as an example.
{
"spawn": "bash",
"host": "0.0.0.0",
"exit": "exit"
}
FTP over ssh to a remote computer, for when scp/sftp is disabled for some reason.
{
"spawn": "sshpass -p {{PASS}} ssh {{USER}}@127.0.0.1",
"host": "0.0.0.0",
"exit": "exit"
}
Bash over kubectl, which is my use case. Create a file sh-pod.sh:
#!/bin/bash
l=$(kubectl get pods | grep "$1" | cut --delimiter=' ' -f1)
kubectl exec -it "$l" -- sh
Make the file executable: chmod +x sh-pod.sh
Configuration file:
{
"spawn": "./sh-pod.sh {{USER}}",
"host": "0.0.0.0",
"exit": "exit"
}
Don't forget to start the utility from its own directory, or specify the full path to sh-pod.sh.
- Bash injections. Injections similar to `mkdir "$(rm -rf /)"` or ``rm "`whoami`"`` are escaped, but this still requires further investigation.
- Windows support. Not tested yet.
- A better way to STOR/RETR files. Currently, working with big files is impossible.