An easy and scalable solution for MySQL database backup to an AWS S3 bucket

We have been working with a client whose SocialEngine-based website has fairly heavy user traffic and lots of data. Because the site is popular and its data changes daily, we have to take a backup every day, just in case something goes wrong and we need to restore it. Storing backups on the local server consumes extra disk space, which is a costly approach in terms of hardware. So we explored other options to store the data securely, efficiently, and relatively cheaply.

After some research we found that Amazon S3 best fits our requirements. S3 storage costs much less than keeping backups on an SSD and requires no maintenance. We did not find any third-party plugin that could handle our requirement to back up the website; the few backup plugins we tried failed to take backups properly.

Amazon S3 stores data as secure objects. It lets you preserve, retrieve, and restore every version of every object in an S3 bucket, so you can easily recover if something is accidentally deleted by a user or lost to an application failure. With Amazon S3 you only pay for the storage you actually use; there is no minimum fee and no setup cost. So we decided to use it to store data for the website.

The best thing about using S3 is that the s3cmd command-line utility can take care of storing the data in the relevant bucket. You just run the command in a shell script and the data lands in the S3 bucket without any hassle. We are sharing the script on GitHub so that you can use it directly for your SocialEngine website.


Steps to back up a website based on SocialEngine

We are going to go through the steps to back up the database in detail. If you like, you can skip the next steps and directly download the script for your website, though we recommend reading the full article.

Here is the checklist for your server:

  • The s3cmd command-line tool configured on the server.
  • A bucket on S3 to store the dump file (click to create an S3 bucket).
  • A bash script containing the MySQL credentials (hostname, username, password, database name), the location on your server where you want to store the dump (PATH), and a log path.
  • chmod +x applied to the MySQL backup script.
  • A test run, with the S3 bucket checked afterwards.
  • The MySQL database backup scheduled with crontab as per your requirement.
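The script you will build in Step 2 names every dump with a timestamp so that successive runs never overwrite each other. The naming scheme can be previewed in isolation (mydatabase is a hypothetical database name):

```shell
DB_NAME="mydatabase"                 # hypothetical database name
TSTAMP=$(date +"%d-%b-%Y-%H-%M-%S")  # e.g. 07-Mar-2016-00-00-00
echo "$DB_NAME-$TSTAMP.sql.gz"
```

Embedding the timestamp in the filename also makes it trivial to locate the dump for a given day when you need to restore.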


MySQL Database Backup

Step 1. Install S3cmd:

Download the latest version of s3cmd and install it:

sudo apt-get -y install python-setuptools
# after downloading s3cmd-1.6.0.tar.gz:
tar xvfz s3cmd-1.6.0.tar.gz
cd s3cmd-1.6.0
sudo python setup.py install

Then run s3cmd --configure. You will be asked for two keys (the Access Key and Secret Key are your identifiers for Amazon S3); copy and paste them from your confirmation email or from your Amazon account page.

They are case sensitive and must be entered accurately, or you will keep getting errors about invalid signatures or similar.

You can optionally enter a GPG encryption key that will be used for encrypting your files before sending them to Amazon. Using GPG encryption will protect your data against reading by Amazon staff or anyone who may get access to them while they are stored on Amazon S3.

Other advanced settings can be changed (if needed) by editing the config file manually. Some of the settings contain the default values for s3cmd to use.

The following is an example of an s3cmd config file (~/.s3cfg):


access_key = <your key>
access_token = 
add_encoding_exts = 
add_headers = 
bucket_location = ap-south-1
ca_certs_file = 
cache_file = 
check_ssl_certificate = True
check_ssl_hostname = True
cloudfront_host =
default_mime_type = binary/octet-stream
delay_updates = False
delete_after = False
delete_after_fetch = False
delete_removed = False
dry_run = False
enable_multipart = True
encoding = UTF-8
encrypt = False
expiry_date = 
expiry_days = 
expiry_prefix = 
follow_symlinks = False
force = False
get_continue = False
gpg_command = /usr/bin/gpg
gpg_decrypt = %(gpg_command)s -d --verbose --no-use-agent --batch --yes --passphrase-fd %(passphrase_fd)s -o %(output_file)s %(input_file)s
gpg_encrypt = %(gpg_command)s -c --verbose --no-use-agent --batch --yes --passphrase-fd %(passphrase_fd)s -o %(output_file)s %(input_file)s
gpg_passphrase = 
guess_mime_type = True
host_base = s3.amazonaws.com
host_bucket = %(bucket)s.s3.amazonaws.com
human_readable_sizes = False
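Because .s3cfg is a plain INI file, you can also script changes to it instead of opening an editor. A small sketch (using a scratch copy of the file, and assuming GNU sed) that points bucket_location at the Mumbai region:

```shell
CFG=$(mktemp)
# minimal stand-in for a real ~/.s3cfg
printf '[default]\naccess_key = AKIAEXAMPLE\nbucket_location = US\n' > "$CFG"
# switch the default bucket region to ap-south-1 (Mumbai)
sed -i 's/^bucket_location = .*/bucket_location = ap-south-1/' "$CFG"
grep '^bucket_location' "$CFG"
```

This is handy when provisioning several servers from the same setup script.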

Step 2. MySQL Database Backup Script for S3:

The following shell script backs up your database and uploads it to S3.

The idea is simple: create the script below and run it with the appropriate values filled in. It uses mysqldump to dump the database to a compressed file in a local backup directory, then uploads that file to S3 using s3cmd.


## Specify the database that you want to back up and where the dump goes
## (the values below are placeholders -- use your own)

# Database credentials
HOST="localhost"
USER="dbuser"
PASSWORD="dbpassword"
DB_NAME="mydatabase"

BACKUPROOT="/opt/backups"          # local directory for dump files
S3BUCKET="s3://my-backup-bucket"   # target S3 bucket

TSTAMP=$(date +"%d-%b-%Y-%H-%M-%S")

# Fail the pipeline if mysqldump fails, not just gzip
set -o pipefail

# Dump the database and compress it on the fly
mysqldump -h"$HOST" -u"$USER" -p"$PASSWORD" "$DB_NAME" | gzip -9 > "$BACKUPROOT/$DB_NAME-$TSTAMP.sql.gz"

if [ $? -eq 0 ]; then
    # Upload the dump into a timestamped folder in the bucket
    s3cmd sync -r "$BACKUPROOT/" "$S3BUCKET/$TSTAMP/"
    # Remove the local copy once it is safely on S3
    rm -rf "$BACKUPROOT"/*
else
    echo "mysqldump failed; skipping upload" >&2
    exit 1
fi
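The script above clears $BACKUPROOT after each upload, but if you prefer to keep a rolling local window of dumps instead, a retention sweep can replace the blanket rm. A sketch assuming GNU find and a 7-day window (demonstrated here on a scratch directory):

```shell
BACKUPROOT=$(mktemp -d)                         # stand-in for the real backup directory
touch -d "8 days ago" "$BACKUPROOT/old.sql.gz"  # simulate a week-old dump
touch "$BACKUPROOT/fresh.sql.gz"                # simulate today's dump
# Delete local dumps older than 7 days instead of removing everything
find "$BACKUPROOT" -name '*.sql.gz' -mtime +7 -delete
ls "$BACKUPROOT"
```

With this in place, a failed S3 upload still leaves you several days of local dumps to fall back on.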

Step 3. Let's run the script now:

Make the script executable, then run it once (the name mysql_backup.sh is assumed here; use whatever you saved the script as):

# chmod +x mysql_backup.sh
# Run the script to make sure it's all good
# bash mysql_backup.sh
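A backup is only as good as its restore path: restoring means downloading the .sql.gz from S3 (s3cmd get), gunzipping it, and piping the SQL into mysql. The compression round trip at the heart of that can be sanity-checked locally with a stand-in file, no database needed:

```shell
TMP=$(mktemp -d)
printf 'CREATE TABLE demo (id INT);\n' > "$TMP/dump.sql"  # stand-in for mysqldump output
gzip -9 -c "$TMP/dump.sql" > "$TMP/dump.sql.gz"           # what the backup script stores
gunzip -c "$TMP/dump.sql.gz" > "$TMP/restored.sql"        # what a restore would feed to mysql
cmp "$TMP/dump.sql" "$TMP/restored.sql" && echo "round trip OK"
```

On a real restore you would run the gunzip output through mysql with the same credentials the backup script uses.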

Backup script output:


Step 4. Schedule it with crontab:

Assuming the backup script is saved as /opt/scripts/mysql_backup.sh (substitute your own script name), we add a crontab entry to run it automatically every week.

 So we have to edit the crontab file:

 #vim /etc/crontab
 #Add the following line:
 #Run the database backup script every Sunday at 00:00

 0 0 * * 0  bash /opt/scripts/mysql_backup.sh >/dev/null 2>&1
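The trailing >/dev/null 2>&1 in the crontab line matters: without it, cron mails every line the script prints. The redirection discards both stdout and stderr while leaving the exit status intact, as this standalone snippet shows:

```shell
# both streams are discarded; nothing is printed, but $? still reflects success/failure
( echo "progress message"; echo "error message" >&2 ) >/dev/null 2>&1
echo "exit status: $?"   # prints: exit status: 0
```

If you do want to be notified of failures, drop the redirection and set MAILTO in the crontab instead.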



In this article, we explained how to automate backing up a MySQL database directly to an AWS S3 bucket. Just configure and schedule the script on your MySQL server and you are done: the backup runs on a regular basis according to your pre-configured schedule. Feel free to ask any questions; we are glad to hear from you.


For further reading

How to design custom sign in UI for AWS Cognito Mobile App?

If you are struggling with how to design a custom AWS Cognito sign-in UI and cannot change the login screen to match your own ideas or requirements, this blog is for you: it shows how to customise the AWS sign-in view of your app. Recently, while working on one of our client projects, we ran into exactly this issue of not being able to change the UI of the login screen. It was quite challenging for us too. We struggled a lot and did a great deal of research on Google, but the results were disappointing. One suggestion we found was that cloning the SignInActivity would help, but it did not. Finally, we started working on it ourselves and, after a long effort, came up with a solution, and guess what? It worked :-).

The default Amazon Cognito user sign-in activity is not very attractive and is surprisingly hard to customise. To change the view you need to customise four classes of AWS Mobile Auth Userpools: SignInActivity, SignInView, AuthUIConfiguration and UserPoolSignInView; you also need to use some features of CognitoUserPoolsSignInProvider. These classes are all inter-related, so to access every component of the sign-in view each class needs changes. Below are more details of the changes you need to make in these classes.

Components of the AWS Cognito custom sign-in UI

SignInActivity – This file is a duplicate of the AWS Mobile Auth UI SignInActivity. In it, you need to plug in your own AuthUIConfiguration class.

SignInView – This is the most important class to modify. It is also a clone of the SignInView from AWS Mobile Auth UI, and in it you must reference your own AuthUIConfiguration and SignInActivity instead of the default classes provided by AWS Mobile Auth UI. You need to put a fully qualified reference to your UserPoolSignInView class in the variable USER_POOL_SIGN_IN_VIEW so that your sign-in activity uses your UserPoolSignInView rather than the default one. Here you can also set the logo and the background image/colour used on the sign-in screen.

UserPoolSignInView – Here you can change the design of your sign-in button: its size, colour, and style.

AuthUIConfiguration – This class stores configuration information for the sign-in UI screen. You just need to copy the default AuthUIConfiguration class of AWS Mobile Auth UI.

CustomCognitoUserPoolsSignInProvider – This class extends CognitoUserPoolsSignInProvider and implements everything needed to manage sign-in using Cognito User Pools. Wherever a CognitoUserPoolsSignInProvider reference is used in your application, use this class instead. Here you can also show a progress bar according to the sign-in results.

A few more files that you need to add or change are activity_sign_in (your sign-in layout), horizontal_or_sign_in_divider, and horizontal_sign_in_divider.


For a sample app, click here.


That's all; you will finally get the desired UI for your sign-in view. Although there are many limitations on what can be changed in the UI, you can make your sign-in view considerably more attractive than the default view provided by Amazon Cognito. The above is a small sample demonstrating the changes that need to be made. For more information, you can contact us through our website. Enjoy coding… 🙂


AWS-SDK-Android – It provides a library and documentation for developers to build connected mobile applications using AWS.

Further Reading

How to Migrate existing Express app to AWS Serverless architecture?

In a previous blog, we discussed how to create a dynamic website using serverless architecture. It's great to use AWS serverless architecture, but what should you do if you have an existing app? Should you rebuild it from scratch? Don't worry: the solution is here. In this blog, we cover all the steps to migrate your existing app to AWS serverless.

Why AWS serverless?

First of all, it is important to address why serverless apps are favoured over traditional server-hosted apps. There are a couple of reasons; let's focus on some of them:

  • Low maintenance
  • Low cost
  • Easy to scale

The biggest benefit by far is that you only need to worry about your code and nothing else. The low maintenance is a result of not having any servers to manage: you don't need to actively ensure that your server is running properly or that it has the right security updates. You deal with your own application code and nothing else.

The main reason it's cheaper to run serverless applications is that you effectively pay per request, so when your application is not being used you are not charged for it. As a quick sense of scale, consider a note-taking application with 1,000 daily active users, each making 20 requests per day to the API and storing around 10 MB of files on S3.

Finally, the ease of scaling is thanks in part to DynamoDB, which gives us near-infinite scale, and to Lambda, which simply scales up to meet demand. And of course, our frontend is a simple static single-page app that is almost guaranteed to respond instantly thanks to CloudFront.

So, let's now focus on our main topic: migrating an existing Express app to AWS serverless architecture. Here is what you will need:


  1. AWS account
  2. S3 Bucket
  3. Little bit idea about AWS API Gateway
  4. Your Express app

Step 1. Install the Express on Serverless node module

The express-on-serverless module allows you to run a Node.js Express app on AWS Lambda using the Serverless framework. Install it with the following command; the process is the same as installing any other module.

npm install express-on-serverless

Step 2. Modify handler.js file

const app = require('./app.js');
exports.index = require('express-on-serverless')(app);

Step 3. Modify serverless.yml file

service: aws-nodejs

provider:
  name: aws
  runtime: nodejs6.10

functions:
  index:
    handler: handler.index
    events:
      - http: any {proxy+}

Step 4. Configure AWS Credential

Check here for how to set up AWS credentials and region for development. In a Linux environment you can inspect the credentials in the following file:

nano ~/.aws/credentials

Step 5. Deploy the Express app

sls deploy or serverless deploy

Now you can access https://API_GATEWAY_HOST/dev/{any route you have created}. It's that easy!

For example, we followed the MVC approach to develop our application, so here is how to access the app:
Our directory structure:

Our app.js file

// Add the following code
// Separating the router from the main app file
var manager = require('./routes/manager');  // path assumed; adjust to your router file's location
app.use('/api/manager', manager);

Our router file inside route directory

var express = require('express');
var router = express.Router();
// Require controller modules
var manager_controller = require('../controllers/manager');

/* Manager Account Operations */
router.get('/list', manager_controller.manager_list);
router.post('/', manager_controller.add);  // HTTP verb was lost in the original; POST assumed for "add"
router.put('/', manager_controller.edit);
router.delete('/:id', manager_controller.delete);

//export all routes
module.exports = router;

Our controller file inside controllers directory

/* Manager Controllers */
var util = require('util');
var managerAccountModel = require('../models/managerAccount');
var helper = require('../middlewares/helper').user;

// Display list of all managers
exports.manager_list = function(req, res, next) {
    res.send('NOT IMPLEMENTED: manager list goes here');
};

So, after deploying, the app's access URL for the route above will look like https://API_GATEWAY_HOST/dev/api/manager/list.


In conclusion, there are lots of Node modules and approaches available for migrating an existing Express app to AWS serverless, but most are buggy due to heavy module dependencies or complex setup steps. After going through a couple of them, we finally found express-on-serverless, which suits us well, and we hope it works like a charm for you too. Feel free to ask any questions; we are glad to hear from you.

Further reading


AWS Mobile SDK – The quickest way to build a AWS mobile app

The AWS Mobile SDK makes it easy for an app to directly access AWS services such as AWS Lambda, S3, DynamoDB, Mobile Analytics, Machine Learning, Auto Scaling, etc. It supports iOS, Android, Xamarin, React Native, and Unity apps. In this article, we mainly discuss the steps to build an AWS mobile app for Android using the AWS Mobile SDK. The AWS Mobile SDK for Android is an open-source software development kit distributed under the Apache open-source licence. Amazon Web Services (AWS) is a cloud services platform that provides a simple way to access servers, storage, databases, and a broad set of application services over the Internet. AWS also owns and maintains the network-connected hardware required for these application services, while you provision and use what you need via a web application.

Steps to use AWS Mobile SDK in your Android mobile app

Step 1: Include the SDK in your AWS mobile app

There are two options to include the AWS mobile SDK in your project:

Option 1: Import the .jar files into the project from the link: Mobile SDK. Drag the .jar files for the individual services your project will use into the app/libs folder; they will be included on the build path automatically. Then sync your project with the Gradle file.

Option 2: Importing gradle file into your Gradle.

    dependencies {
        compile 'com.amazonaws:aws-android-sdk-core:2.2.+'
        compile 'com.amazonaws:aws-android-sdk-s3:2.2.+'
        compile 'com.amazonaws:aws-android-sdk-ddb:2.2.+'
    }

Include as many of these dependencies as your project needs, then sync your Gradle project.

Step 2: Set permissions in your project's manifest file

Add the following permission to your AndroidManifest.xml

    <uses-permission android:name="android.permission.INTERNET" />

Step 3: Get your AWS Credentials

The next step is to create an account in the AWS console. After creating the account you will get your authentication credentials, which your app can use to access your AWS data. For example, to access EC2:

   AmazonEC2Client ec2Client = new AmazonEC2Client(getCredentials());
   DescribeInstancesResult eC2value = ec2Client.describeInstances();
   DescribeRegionsResult eCRegion = ec2Client.describeRegions();

Here the getCredentials() method returns the credentials you provide to authenticate yourself. If the credentials are authenticated you will get the EC2 data; otherwise an exception is thrown. In the same way, you can retrieve the other AWS resources such as S3, DynamoDB, RDS, etc.

You can also obtain AWS Credentials using Amazon Cognito Identity as your credential provider. Using a credentials provider allows your app to access AWS services without having to embed your private credentials in your application. This also allows you to set permissions to control which AWS services your users have access to.

To use Amazon Cognito, you must create an identity pool. An identity pool is a store of user identity data specific to your account. Every identity pool has configurable IAM roles that allow you to specify which AWS services your application’s users can access.


There are various advantages and benefits to AWS, such as trading capital expense for variable expense, benefiting from massive economies of scale, and increased speed and agility. Given all these benefits, having a mobile app that can access AWS resources can prove very helpful and important, and using the AWS Mobile SDK in your app is quite easy. I hope this article helps you start your project using AWS data in your app :-).


AWS : It features various types of resources to help us learn about the services and features AWS has to offer and get started with building our solutions faster.

AWS Mobile SDK: It includes libraries, code samples, and documentation for different mobiles platforms so we can build apps that deliver great experiences across devices and platforms.

Further Reading

Easy steps to install Magento2 with PHP7 on Amazon EC2

If you are planning to run an online store for your products, the first choice you will get from eCommerce experts is a Magento store. Magento is a widely used eCommerce platform among online sellers. Once Magento has been chosen, the next challenge is to select hosting on which your store runs smoothly. Amazon EC2 is the most preferred hosting service and is widely used by eCommerce store owners. But how can someone install Magento2 on Amazon EC2 without any technical help? The answer is to use an Amazon Machine Image (AMI) which has a built-in environment for Magento.

It has been a long time since Magento2 was launched, and it is now replacing Magento 1.9.x with powerful features. It is easy to use Magento2 with PHP 5.6.x and Apache 2, but problems occur when you want to install the latest third-party Magento2 plugins.

Is there really an issue installing Magento2 plugins, or is it a rumour spread by Magento2 critics? Well, it's not a rumour: we faced this issue while working for one of our clients. In this article, we cover these Magento2 server configuration issues and explore the available solutions.

How did it happen ?

We used a scalable Magento2 stack AMI for an online store with the following configuration.

It worked quite well for default Magento2, but things went nasty when we tried to install one of the latest plugins.



What were our key findings?

1. PHP version: There was no standard PHP7 binary for the Amazon Machine Image at the time of writing, and we faced the following issues:

  • Would it be compatible with the currently running Apache server? We tried to install the php7 module from Remi, but it is not compatible with Apache 2.4.
  • Are all the PHP extensions required by Magento available?

2. PHP extensions:
It is very easy to install a PHP extension, but consider what happens when one of the extensions has library dependencies.

  • How do we find the correct version of the library?

There may be lots of candidate libraries; the only option is trial and error.
3. Setting up cron for Magento: This is also quite complex: permission fights, issues with the default php.ini path after installing PHP7, and so on.

How do we solve these problems and install Magento2 on the Amazon Ec2 with Php 7?

1. Install PHP7.0.7 from Remi

Remove PHP if it is already installed:

sudo yum remove php5*

Add the following repository:

sudo yum install remi-release-6.rpm

Edit the following file

/etc/yum.repos.d/epel.repo and set enabled=1

Execute the following command

sudo yum upgrade -y
sudo yum install php70

Install the following extension

sudo yum install php70-php-fpm
sudo yum install php70-php-xml
sudo yum install php70-php-pdo
sudo yum install php70-php-mysqlnd
sudo yum install php70-php-imap
sudo yum install php70-php-intl
sudo yum install php70-php-zip
sudo yum install php70-php-pecl-apcu
sudo yum install php70-php-mbstring
sudo yum install php70-php-mcrypt
sudo yum install php70-php-opcache

To add the php70-php-gd extension, execute the following commands:

sudo rpm -ivh
sudo yum install php70-php-gd

Start the FPM daemon manually each time the server is rebooted:

sudo /etc/init.d/php70-php-fpm start

or, to auto-start PHP-FPM on reboot, use the following:

sudo chkconfig php70-php-fpm on

Switch Apache from the prefork to the event MPM (this is required because mod_php isn't thread-safe) in /etc/httpd/conf.modules.d/00-mpm.conf:

LoadModule mpm_event_module modules/

and comment out the others.

Instruct Apache to pass all PHP requests to PHP-FPM by adding the following lines to /etc/httpd/conf/httpd.conf:

<FilesMatch \.php$>
    SetHandler "proxy:fcgi://"
</FilesMatch>
DirectoryIndex /index.php index.php

Restart Apache using sudo service httpd restart. If everything went OK, you should be able to verify the installation by requesting a PHP file containing phpinfo().

If you have existing shell scripts that use PHP's CLI interpreter and thus start with #!/usr/bin/php, you have to set up a symlink at /usr/bin/php, since the binary is now named /usr/bin/php70. You can do this as follows:

sudo ln -s /usr/bin/php70 /usr/bin/php
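The same symlink pattern works for any versioned binary. Here is a self-contained demonstration in a scratch directory, with a tiny stand-in script playing the role of /usr/bin/php70:

```shell
BIN=$(mktemp -d)
printf '#!/bin/sh\necho "PHP 7.0.7"\n' > "$BIN/php70"  # stand-in for the versioned binary
chmod +x "$BIN/php70"
ln -s "$BIN/php70" "$BIN/php"  # generic name resolves to the versioned binary
"$BIN/php"                     # prints: PHP 7.0.7
```

Because the link is resolved at call time, upgrading later only means pointing the symlink at the new binary; scripts using the generic name keep working.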

2. Install Magento2

Now download Magento and upload it to /var/www/html/, or install it using Composer.
Give appropriate permissions to app/etc, cron.php, and the var/ directory.
Create a crontab for the Magento user, e.g.:

sudo crontab -u ec2-user -e

add following lines:

* * * * * sudo /usr/bin/php70 -c /etc/opt/remi/php70/php.ini /var/www/html/bin/magento cron:run >> /var/www/html/var/log/magento.cron.log
* * * * * sudo /usr/bin/php70 -c /etc/opt/remi/php70/php.ini /var/www/html/update/cron.php >> /var/www/html/var/log/update.cron.log
* * * * * sudo /usr/bin/php70 -c /etc/opt/remi/php70/php.ini /var/www/html/bin/magento setup:cron:run >> /var/www/html/var/log/setup.cron.log

now check:

sudo crontab -u ec2-user -l

restart cron service

sudo service crond restart

A Quick and Easy Solution : Magento2 stack AMI on AWS

If you are a non-technical person and want to set up your eCommerce website in a few minutes, use a Magento2 stack AMI and run your online store on the Amazon cloud in minutes. It is a ready-to-run, prebuilt environment tuned for scalability, performance, and high availability for your online store.
The following are some vendors that provide a Magento stack AMI on AWS:
Magento2 stack AMI – by iPragmatech
Deploy Magento on Amazon Web Services – by Bitnami


In this article, we described common Magento2 server configuration issues, which are also quite time-consuming; there may be more configuration issues that take from a few hours to a day to resolve. We suggest using a ready-made solution that saves time and effort instead of configuring everything on your own.


Magento Cron Setup
Install PHP 7 on EC2 t2.micro Instance running Amazon Linux Distro
