tFLOW Archival Function to Amazon S3


The Archival function allows you to mark completed orders and jobs as 'archived' and to move the related files to a different location to free up online storage space.
You can find a detailed description of Archival here.
 
tFLOW can store archived material in the cloud storage provided by Amazon Web Services (AWS).
The long-term storage service is called S3, and it provides access to a highly scalable, reliable, fast, and inexpensive data storage infrastructure.
More information on AWS S3 can be found here:
http://docs.aws.amazon.com/AmazonS3/latest/dev/Welcome.html
 
Configuring tFLOW to access S3 is quite simple.
 
 
 
1-Subscribe to the AWS service
Point your browser to http://aws.amazon.com and create your account.
When your account has been created, log in to your AWS console.
 
 
 
2-Create a new user
Users and groups can be found under IAM (Identity and Access Management).
Click Users in the left menu, then click the "Create User" button.
 

Enter a username:
 
 
 
Create a group and assign it the "AmazonS3FullAccess" policy:
 
 
 
Create the user and download the credentials as a CSV file.
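 
The console steps above are the standard way to create the user. If you prefer to script this step, the short Python (boto3) sketch below does something equivalent: it creates a user, grants it "AmazonS3FullAccess" (attached directly to the user rather than through a group, for brevity), and saves the access keys to a CSV file. The user name and file name are placeholders of our own choosing, not values required by tFLOW, and the sketch assumes you already have administrator AWS credentials configured on your machine.

    import csv
    import boto3

    iam = boto3.client("iam")

    user_name = "tflow-archiver"  # placeholder name, choose your own
    iam.create_user(UserName=user_name)

    # Grant the same rights the guide assigns through the console group.
    iam.attach_user_policy(
        UserName=user_name,
        PolicyArn="arn:aws:iam::aws:policy/AmazonS3FullAccess",
    )

    # Create the access key pair and save it as a CSV, mirroring the console download.
    key = iam.create_access_key(UserName=user_name)["AccessKey"]
    with open("credentials.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["Access key ID", "Secret access key"])
        writer.writerow([key["AccessKeyId"], key["SecretAccessKey"]])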
 
 
3-Create a new bucket
Amazon S3 is cloud storage for the Internet. To upload your data (photos, videos, documents, etc.), you first create a bucket in one of the AWS Regions.
Assign a bucket name; please avoid spaces and use hyphens to separate words (as in the example below).
Then select the region where you would like the bucket to be created.
 
 
 
Now create a folder inside the bucket
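 
If you prefer, the same bucket and folder can also be created with a short script. The Python (boto3) sketch below uses the example names from this guide (bucket "tflow-archival", folder "a1"); the region and the credential placeholders are assumptions you should replace with your own values.

    import boto3

    # Placeholders: use the Access Key and Secret Key from the CSV you downloaded.
    s3 = boto3.client(
        "s3",
        aws_access_key_id="ACCESS_KEY_FROM_CSV",
        aws_secret_access_key="SECRET_KEY_FROM_CSV",
        region_name="eu-west-1",  # assumption: choose your own region
    )

    # Create the bucket in the chosen region (omit CreateBucketConfiguration for us-east-1).
    s3.create_bucket(
        Bucket="tflow-archival",
        CreateBucketConfiguration={"LocationConstraint": "eu-west-1"},
    )

    # An S3 "folder" is simply a zero-byte object whose key ends with "/".
    s3.put_object(Bucket="tflow-archival", Key="a1/")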
 
 
4-Configure tFLOW
Log in to tFLOW as admin and go to System Settings/Archiving.
 
Select S3 storage, then enter the Access Key and the Secret Key from the CSV file you downloaded when you created the user:
 
 
 
In the URL field, enter the address using the following syntax:

    s3://bucket-name/folder-name
 
In our example, the URL will be:
 
    s3://tflow-archival/a1
 
IMPORTANT: Please do not store any other files in this bucket/folder, do not connect any other service to it, and do not delete any folders or files!

Press Save. If the provided credentials are correct, the system will connect to S3 and be ready to use; otherwise, an error message will be shown.
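 
If the connection fails and you want to check the credentials and URL outside tFLOW, the small Python (boto3) sketch below parses the same s3:// address into a bucket and folder prefix and confirms the keys can reach it. The keys shown are placeholders for the values from your CSV file, and the URL is the example from this guide.

    from urllib.parse import urlparse

    import boto3

    url = "s3://tflow-archival/a1"  # the same URL entered in tFLOW
    parsed = urlparse(url)
    bucket, prefix = parsed.netloc, parsed.path.lstrip("/")

    # Placeholders: use the Access Key and Secret Key from the CSV you downloaded.
    s3 = boto3.client(
        "s3",
        aws_access_key_id="ACCESS_KEY_FROM_CSV",
        aws_secret_access_key="SECRET_KEY_FROM_CSV",
    )

    s3.head_bucket(Bucket=bucket)  # raises an error if the bucket is unreachable
    resp = s3.list_objects_v2(Bucket=bucket, Prefix=prefix + "/", MaxKeys=5)
    print("Bucket reachable; objects under folder:", resp.get("KeyCount", 0))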
 
 
 
The archived material will be stored in the selected bucket:
 

 
 
 
 

How to migrate archived items from Local to S3
 

If you are currently using Local archiving and would like to move to S3, here is the procedure to migrate your archived material.
 
  1. Log in with an FTP client to your archival storage (please contact Aleyant support if you do not have credentials)
  2. Download all the folders, subfolders, and files stored there to your local computer
    To avoid overloading the server, please download from FTP overnight.
  3. Log in to your S3 console and upload all the folders there, including all the subfolders/files (a scripted alternative is sketched at the end of this section)
During this procedure, please stop using the Archival function in tFLOW to avoid altering the list of archived material, which could result in lost items.
 
Please make sure to upload to the S3 bucket/folder you have set in tFLOW and to preserve the folder names and contents!
 
When this procedure is complete, delete all the archived folders from Local storage using the FTP client.
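 
If the amount of archived material makes a manual console upload (step 3 above) impractical, you can script that step instead. The Python (boto3) sketch below walks the folder tree you downloaded from FTP and uploads every file to the bucket/folder configured in tFLOW, preserving the relative paths. The local path, bucket name, folder name, and credential placeholders are the example values from this guide; replace them with your own.

    from pathlib import Path

    import boto3

    local_root = Path("./archival-download")  # where the FTP download was saved (example)
    bucket = "tflow-archival"                 # example bucket from this guide
    prefix = "a1"                             # must match the folder set in tFLOW

    # Placeholders: use the Access Key and Secret Key from the CSV you downloaded.
    s3 = boto3.client(
        "s3",
        aws_access_key_id="ACCESS_KEY_FROM_CSV",
        aws_secret_access_key="SECRET_KEY_FROM_CSV",
    )

    for path in local_root.rglob("*"):
        if path.is_file():
            # Keep the same relative path under the configured bucket/folder.
            key = f"{prefix}/{path.relative_to(local_root).as_posix()}"
            print("uploading", key)
            s3.upload_file(str(path), bucket, key)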