Tuesday, May 21, 2019


Process To Test AWS Route53 ChangeResourceRecordSets API from Terminal

Here I am explaining the process to test the AWS Route53 ChangeResourceRecordSets API from the terminal. You can copy the code below, paste it into your terminal, and press ENTER.


aws route53 change-resource-record-sets --hosted-zone-id ZGN2AVYTFOA --change-batch '{
 "Comment": "",
 "Changes": [
   {
     "Action": "CREATE",
     "ResourceRecordSet": {
       "Name": "anilc.phpmypassion.com.",
       "Type": "A",
       "AliasTarget": {
          "HostedZoneId": "Z26RNL4JYFTOTI",
          "DNSName": "phpmypassion-event-0ddd4472d016751e.elb.us-east-1.amazonaws.com",
          "EvaluateTargetHealth": false
       }
     }
   }
 ]
}'
"HostedZoneId": "Z26RNL4JYFTOTI" :- This hosted zone ID belongs to the load balancer.

--hosted-zone-id ZGN2AVYTFOA :- This is the Route53 hosted zone ID.


On success, the API returns a ChangeInfo response like this:

{
    "ChangeInfo": {
        "Id": "/change/CWR0FJ4AJNZFT",
        "Status": "PENDING",
        "SubmittedAt": "2019-05-21T10:06:47.123Z",
        "Comment": ""
    }
}
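The same change batch can also be built in Python. Below is a sketch that constructs the batch as a plain dict, using the zone IDs and names from the example above; the boto3 client call is commented out because it needs real AWS credentials, and the helper name is my own invention:

```python
import json

# Hypothetical helper mirroring the CLI call above; values are the ones from this example
def build_change_batch(record_name, dns_name, alias_zone_id):
    return {
        "Comment": "",
        "Changes": [{
            "Action": "CREATE",
            "ResourceRecordSet": {
                "Name": record_name,
                "Type": "A",
                "AliasTarget": {
                    "HostedZoneId": alias_zone_id,  # the load balancer's zone ID
                    "DNSName": dns_name,
                    "EvaluateTargetHealth": False,
                },
            },
        }],
    }

batch = build_change_batch(
    "anilc.phpmypassion.com.",
    "phpmypassion-event-0ddd4472d016751e.elb.us-east-1.amazonaws.com",
    "Z26RNL4JYFTOTI",
)
print(json.dumps(batch, indent=2))

# With credentials configured, the actual call would look like:
# import boto3
# boto3.client("route53").change_resource_record_sets(
#     HostedZoneId="ZGN2AVYTFOA", ChangeBatch=batch)
```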


Tuesday, March 19, 2019


Process to Create AWS Lambda Function in Python

In this article I am sharing the full, easy process of creating an AWS Lambda function using Python 3. So follow the step-by-step process below.

********************** process to create lambda function *****************

> fill in basic information like name, runtime environment, permissions

*******************Now add Trigger from the list in left side****************

In my case I used S3.

-> configure triggers basics
-> select s3 Bucket [bucket name on that you want run your trigger]
-> event type {like bucket PUT}
-> prefix
-> suffix [like - .csv, .jpg]

Note:- you can choose either a prefix or a suffix, not both.

*********************** upload your lambda function code ****************

In my case I had to produce the CSV file name as a message whenever a Python program automatically uploaded an Athena query result to the S3 bucket. So I produced the message with a Kafka producer on the S3 PUT event trigger.

To set up the Kafka producer on the Lambda trigger, I followed the steps below.

#Install kafka-python on your local machine and create a .zip of the whole package.

Follow below steps:-

# https://pypi.org/project/kafka-python/

# pip install kafka-python -t <FOLDER PATH>
# pip install requests -t <FOLDER PATH>

* Please install the kafka-python and requests libs in the same folder for it to work
    in AWS Lambda.
* When uploading to AWS Lambda, zip the entire folder.
   If you are uploading a zip file, make sure that you are zipping the contents
   of the directory and not the directory itself.
   Otherwise you will get an error: aws lambda unable to import module

zip -r ../build_dmp_dashboard_from_athena.zip *


#Now you have to create a file that produces the message on the trigger, as below -

Let's create a file first, named lambda_producer.py

from __future__ import print_function
import sys, time
import json
import requests
from kafka import KafkaProducer

########################### Configurations #############################################
email_api = 'https://api.phpmypassion.com/api/send-mail'
email_to = '[email protected]'

kafka_server_cluster = ['cluster1', 'cluster2', 'cluster3']
topic = "athena_query_result"

# Send an alert email through the mail API
def sendmail(message):
  msg = ("{}:{}".format("lambda_producer", message))
  data = {
    'toEmail': email_to,
    'mailSubject': msg,
    'mailBody': msg
  }
  return requests.post(email_api, data)

# Producer connection (retry up to 3 times)
exception_counter = 0
for x in range(3):
  try:
    producer = KafkaProducer(bootstrap_servers=kafka_server_cluster)
    break
  except Exception as e:
    exception_counter = exception_counter + 1
    print("Error in connection to kafka cluster: {0}".format(kafka_server_cluster))
    time.sleep(2)  # 2 sec

if exception_counter >= 2:
  message = ('Error: Looks like connection to kafka cluster failed on lambda producer: {0}'.format(kafka_server_cluster))
  response = sendmail(message)
  print("Email Sent Response: {0}".format(response))
  print("Program exiting")
  sys.exit(1)

# Lambda handler. Receives the S3 trigger event and produces a message on the Kafka queue.
def lambda_handler(event, context):
  print("Received event : " + json.dumps(event, indent=2))

  # Parse the S3 trigger and get the CSV path & file name from the "event"
  key_file_name = (event['Records'][0]['s3']['object']['key'])

  print("We have a new file. file name & path : ", key_file_name)
  try:
    producer_ret_val = producer.send(topic, key_file_name.encode())  # If the topic does not exist, the Python producer creates it automatically
    record_metadata = producer_ret_val.get(timeout=10)
    print("sent event to Kafka! topic {} partition {} offset {}".format(record_metadata.topic, record_metadata.partition, record_metadata.offset))
  except Exception as e:
    print("Some error in lambda_producer")
    raise e
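The handler only reads the object key out of the S3 event. You can sanity-check that parsing locally with a minimal event dict (shaped after the S3 event notification format; the key below is just an example):

```python
# Minimal shape of an S3 PUT event notification; only the object key is read by the handler
event = {"Records": [{"s3": {"object": {"key": "athena-results/query1.csv"}}}]}

key_file_name = event['Records'][0]['s3']['object']['key']
print("New file:", key_file_name)  # New file: athena-results/query1.csv
```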

#Now let's set this up within AWS Lambda:-

-> choose "code entry type" and select the option to upload a .zip file

-> upload your zip file and click save

-> your lambda code has been uploaded

-> in my case I uploaded the Kafka configuration zip folder

#You also have to change the Lambda handler setting. Since you are using the lambda_handler function within the lambda_producer.py file, fill in "lambda_producer.lambda_handler" in the Handler input box.

#Now click on the save button.

You are done with all the steps.


Sunday, March 17, 2019


How to get data from redshift table as dictionary in python

Dict cursors allow access to the retrieved records using an interface similar to Python dictionaries instead of tuples.

Pass psycopg2's RealDictCursor as the cursor_factory when creating your cursor.

After adding it, your cursor should look like this -

cur = con.cursor(cursor_factory=psycopg2.extras.RealDictCursor)

If you get an error that the extras module is not found -
AttributeError: module 'psycopg2' has no attribute 'extras'

resolve it by importing the module as below

import psycopg2
import psycopg2.extras

Now if you run this code, you will get data from Redshift as a dictionary of key, value pairs like below :-

{'id': 1234567, 'date': datetime.date(2019, 2, 13), 'hour': 0, 'code': 42107, 'count': 6, 'revenue': None, 'payout': None, 'week': 7}
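Note that the row contains a datetime.date value, which json.dumps cannot serialize by default. Here is a small sketch, using the sample row above, of how such a record can be dumped to JSON:

```python
import json
import datetime

# Sample row as returned by RealDictCursor (from the output above)
row = {'id': 1234567, 'date': datetime.date(2019, 2, 13), 'hour': 0,
       'code': 42107, 'count': 6, 'revenue': None, 'payout': None, 'week': 7}

def default(o):
    # Convert date/datetime values to ISO strings so json.dumps can handle them
    if isinstance(o, (datetime.date, datetime.datetime)):
        return o.isoformat()
    raise TypeError("Not JSON serializable: {}".format(type(o)))

print(json.dumps(row, default=default))
```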


Saturday, March 16, 2019


Pandas- ImportError: Missing required dependencies ['numpy']

This is a common error that sometimes occurs when importing the pandas library. Here I am sharing the solution. Please follow the steps described below, and if you face any difficulty, let me know in a comment.


>>> import pandas
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/anil/anaconda3/lib/python3.6/site-packages/pandas/__init__.py", line 19, in <module>
    "Missing required dependencies {0}".format(missing_dependencies))
ImportError: Missing required dependencies ['numpy']


Check the folder list in your Python site-packages folder

-> ls -la

If you see a numpy folder there, run the command below. The error usually means a duplicate or broken numpy install is shadowing the one bundled with Anaconda, so removing the duplicate lets pandas import the working copy :-

sudo pip3 uninstall numpy

Now check the folder list again with -

-> ls -la

Try importing pandas again -

[email protected]:~/.local/lib/python3.6/site-packages$ python3
Python 3.6.4 |Anaconda, Inc.| (default, Jan 16 2019, 18:10:19)
>>> import pandas

You will not see this error now.
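To confirm which copy of a package Python will actually import, you can inspect its module spec with the standard library alone. This is a small sketch (the helper name is my own):

```python
import importlib.util

def module_origin(name):
    """Return the file path Python would load the named module from, or None if absent."""
    spec = importlib.util.find_spec(name)
    return spec.origin if spec else None

# A stdlib module always resolves to a path
print(module_origin("json"))
# For numpy this shows which install wins (or None if numpy is not installed)
print(module_origin("numpy"))
```

If the printed numpy path points into ~/.local/lib rather than your Anaconda directory, that is the duplicate install shadowing the bundled one.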


Process to Change MySql ip-Address to Listen for all IP's

*************** process to change MySql ipaddress to listen for all ip ********

-> update user set Host="%" where User="phpmyadmin"; [run in the mysql client, on the mysql database]

-> sudo vi /etc/mysql/mysql.conf.d/mysqld.cnf [change "bind-address = 0.0.0.0"]

-> FLUSH PRIVILEGES; [then restart mysql: sudo service mysql restart]

-> check with netstat -nlp | grep 3306 [output should show mysql listening]

-> telnet <server-ip> 3306 [now check whether it connects from the IP]

Friday, March 15, 2019


Safest Way to Merge A Git Branch Into Master

In this article I am going to explain the safest way to merge a git branch (a stage branch) into the master branch.

Git Merge:-

Never forget to pull from the master branch after pushing all your new changes to your current git branch.

If your current branch (the stage branch) is up to date and you now want to merge it into the master branch, follow the process below.

First check out the master branch, then merge with the commands below:

-> git pull origin <your-current-branch-name> [git pull origin master]
-> git merge <your-previous-branch-name> [git merge stage]
-> git push origin <your-current-branch-name> [git push origin master]
-> git checkout <your-previous-branch-name> [git checkout stage]

You are done with all the steps.

Top Mostly Used Git Commands

************************** Git clone a Repository ******************

git clone <repository_url>

************************ Check git branch *********************

git branch

************************ Git Create Branch *************

git checkout -b <branch-name>

************************ Checkout to another branch *************

git checkout <branch-name>

************************ Make your file same as git repository *************

git checkout <file-name>

-> for all files :-

git checkout .

************************* Git commit command *********************

git add . [for all files]

git commit -m "your comment"

************************* Git push command **********************

git push origin <your-branch-name>

*********************** Git pull command ************************

git pull origin <your-branch-name>

*********************** Git delete branch command ***************

git branch -d <your-branch-name>

************************** Git check current repository URL command ***********

-> git remote -v

-> git remote show origin

**************************** Git change repository remote URL command ************
$ git remote set-url origin [email protected]:USERNAME/REPOSITORY.git

Thursday, March 14, 2019


List of Top Docker Commands in Linux

Docker works much like Git:

you should always pull to get the latest image.

****************************** Docker install ******************

sudo apt-get update
sudo apt-get install docker.io

**************************** Check docker installed or not *************

docker ps -a

************************* Login to AWS container via command line ************

$(aws ecr get-login --no-include-email --region us-east-1)

Optional flags: [--registry-ids <value> [<value>...]] [--include-email | --no-include-email]

**************************** pull the docker image *************

docker pull your-server.us-east-1.amazonaws.com/aws_repository

************************** Docker push image **************************************

-> commit your changes first

docker commit <your-container-name> your-server.us-east-1.amazonaws.com/aws_repository:v2.2 (v2.2 is the tagging  version)

-> Now run push command

docker push your-server.us-east-1.amazonaws.com/aws_repository:v2.2

you can check your latest push on server [aws/ECR/Repository/click on your repository]

*********************************** Create container with in docker *********************

docker run -it --restart always --net=subnet15 --hostname=<your-container-name> --name <your-container-name> your-server.us-east-1.amazonaws.com/aws_repository

**************************** Enter into docker container *************

-> docker exec -it <docker-container-name> bash

-> docker attach <docker-container-name>

****************************** To check docker container *******
-> docker ps

-> su md (switch to md user)

*************************** Check all cron in docker *****
crontab -l [use crontab -e to edit]

******************** Install library into lib folder ***********************
pip3 install name -t . (within your python library path)

*************************** List Docker CLI commands **************
docker container --help

*************** Display Docker version and info *************
docker --version
docker version
docker info

************ Execute Docker image **********
docker run hello-world

******************* List Docker images ***************
docker image ls

******************* List Docker containers (running, all, all in quiet mode) *****************
docker container ls
docker container ls --all
docker container ls -aq


Wednesday, March 13, 2019


How To Remove Docker Images ?

Removing Docker Images

Remove one or more specific images

Use the docker images command with the -a flag to locate the ID of the images you want to remove. This will show you every image, including intermediate image layers. When you've located the images you want to delete, you can pass their ID or tag to docker rmi:
  • docker images -a
  • docker rmi <image-id-or-tag> [<image-id-or-tag> ...]

Sunday, February 10, 2019


Difference Between DDL and DML?


DML statements are SQL statements that manipulate data. DML stands for Data Manipulation Language. The SQL statements that are in the DML class are INSERT, UPDATE and DELETE. Some people also lump the SELECT statement in the DML classification.
DML example SQL statements are below -
  • SELECT – retrieve data from a database
  • INSERT – insert data into a table
  • UPDATE – update existing data within a table
  • DELETE – delete records from a database table
  • MERGE – UPSERT operation (insert or update)
  • CALL – call a PL/SQL or Java subprogram
  • EXPLAIN PLAN – interpretation of the data access path
  • LOCK TABLE – concurrency Control


Data Definition Languages (DDL) are used to define the database structure. Any CREATE, DROP and ALTER commands are examples of DDL SQL statements.
DDL example SQL commands are below -
  • CREATE – to create database and its objects like (table, index, views, store procedure, function and triggers)
  • ALTER – alters the structure of the existing database
  • DROP – delete objects from the database
  • TRUNCATE – remove all records from a table, including all spaces allocated for the records are removed
  • COMMENT – add comments to the data dictionary
  • RENAME – rename an object
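To see the two classes side by side, here is a small runnable sketch using Python's built-in sqlite3 module (syntax details vary between databases, but the DDL/DML split is the same):

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# DDL: define the structure
cur.execute("CREATE TABLE widgets (id INTEGER PRIMARY KEY, name TEXT)")

# DML: manipulate the data
cur.execute("INSERT INTO widgets (name) VALUES (?)", ("gear",))
cur.execute("UPDATE widgets SET name = ? WHERE id = ?", ("sprocket", 1))
rows = cur.execute("SELECT id, name FROM widgets").fetchall()
print(rows)  # [(1, 'sprocket')]

# DDL again: drop the structure
cur.execute("DROP TABLE widgets")
```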

Friday, February 8, 2019


How To Integrate Facebook PHP SDK With Laravel 5.4

In this article I am explaining a simple process to set up the Facebook Marketing SDK via a PHP Artisan command with Laravel 5.4. You can get campaigns or your ad account data either by setting up a cron for the created command or by running the command in the terminal.

First, edit composer.json in the project's root folder to include the Facebook SDK:

{
    "require": {
        "facebook/php-business-sdk": "3.1.*"
    }
}

Next, run composer at the shell prompt to pull the SDK into the vendor folder:

php composer.phar install --no-dev

If you do not have composer.phar in your Laravel project, just copy it from the Facebook Business SDK for PHP.

Now we would like to use the Facebook SDK within our app as a command. But before doing that, let us set up our app_id, app_secret and default_graph_version, which are parameters required while making requests to the Facebook API. You can obtain app_id and app_secret from your Facebook App Settings.

Once we have these credentials from Facebook, we would now edit the .env file in the project's root folder. Add them to the end:

FB_ADS_APP_ID=xxxxxxxxxx
FB_ADS_APP_SECRET=xxxxxxxxxx
FB_DEFAULT_GRAPH_VERSION=v2.8

Replace the xxx.. with the values provided to you. Note that the variable names are just my own creation; you can name them whatever you'd like. We would now have to use these variables to set up a separate config file. We need to do this so that we can use Laravel's config() helper function to retrieve the values wherever we want within the app. So let's create facebook-ads.php in the config folder and add the following:

<?php

return [
    'app_id' => env('FB_ADS_APP_ID', null),
    'app_secret' => env('FB_ADS_APP_SECRET', null),
    'default_graph_version' => env('FB_DEFAULT_GRAPH_VERSION', 'v2.8'),
];

Now we have to create a command class, FacebookAdsService, with the command below..

php artisan make:console FacebookAdsService

<?php

namespace App\Console\Commands;

use Illuminate\Console\Command;

class FacebookAdsService extends Command
{
    // The name and signature of the console command.
    protected $signature = 'facebook_sdk:service';

    // The console command description.
    protected $description = 'Facebook ads api for campaign service, adset etc';

    // Create a new command instance.
    public function __construct()
    {
        parent::__construct();
    }

    // Execute the console command.
    public function handle()
    {
        // Your code goes here
    }
}

You have to write all your code to get Facebook campaigns & their reporting in the handle() function.

Now, to run this command in the terminal, we have to register it in Kernel.php at "app/Console/Kernel.php".

So add Commands\FacebookAdsService::class to the $commands array as below -

protected $commands = [
    Commands\FacebookAdsService::class,
];

Now we have everything set up to get campaign or adset reporting from the Facebook Marketing API.

Now just add, at the top of our FacebookAdsService class, the classes that we are going to use to get campaign data.

use FacebookAds\Object\AdAccount;
use FacebookAds\Api;
use FacebookAds\Object\AdSet;
use FacebookAds\Object\Fields\AdSetFields;

Now write your code to get Facebook campaign data in its handle() function as below -

We have to instantiate the Api object first to make a Facebook API call.

To instantiate an Api object you will need a valid access token:

$app_id = env('FB_ADS_APP_ID');
$app_secret = env('FB_ADS_APP_SECRET');
$access_token =  env('FB_ACCESS_TOKEN');
Api::init($app_id, $app_secret, $access_token);
$api = Api::instance();

Get adAccount details from object -

$account_id = 'act_XXXXXXXX';
$account = new AdAccount($account_id);

Get All Campaigns -

$fields = array('name', 'objective', 'start_time');
$params = array(
      'effective_status' => array('ACTIVE','PAUSED'),
);
$r = json_encode($account->getCampaigns(
     $fields, $params
     )->getResponse()->getContent(), JSON_PRETTY_PRINT);

Get All AdSet -

$fields = array(
);
$r = json_encode($account->getAdSets($fields)->getResponse()->getContent(), JSON_PRETTY_PRINT);

Get Ad Insight -

You can change fields and parameters according to your need.
Check All Ads Insights Parameters and Fields

$fields = array(
);
$params = array(
     'time_increment' => '1',
     'date_preset' => 'lifetime',
     'breakdowns' => array(),
     'sort' => array(),
);

$cmp_id = 'XXXXXXXXX';

$data = json_encode((new AdSet($cmp_id))->getInsights(
     $fields, $params
     )->getResponse()->getContent(), JSON_PRETTY_PRINT);

Run your command on terminal -

php artisan facebook_sdk:service

You can see your output on the terminal.

If you face any difficulty setting up the Facebook SDK with Laravel, let me know in a comment.

Monday, February 4, 2019


Laravel Advanced: How to Pass a Variable Into a Nested Where Function?

Whenever you want to use a variable inside the Closure scope, you need to use the "use" keyword to pass the variable into the Closure:

foreach ($user->locations as $location) {
    // $code is assumed to be defined earlier in scope
    $r = TableName::where('id', '<>', $this->id)
        ->where(function ($q) use ($code) { // SEE HERE
            $q->where('name', $code)
              ->orWhere('alias', $code);
        })
        ->get();
}

If you have an array variable like $requestParam['name'],

you have to assign it to a plain variable before the query statement -

$name = $requestParam['name'];
$lastRecordResult = YrModel::where('type', self::table)
    ->where(function ($q) use ($name) {
        $q->where('name', $name)
          ->orWhere('alias', $name);
    })
    ->where('t_id', $requestParam['t_id'])
    ->where('v_id', $requestParam['v_id'])
    ->first();


Wednesday, December 26, 2018


Get an Insert Query Statement For All Rows in MySql

This is a common task that comes up in back-end programming. Sometimes we need all rows of a table as insert query statements, just like -

INSERT INTO MyTable(ID,Col1,Col2,Col3) VALUES (100,'some value','some value','2018-10-20');

So the question is "How do we get the insert query statement for this easily ?"

Here I am going to write a simple PHP function with which you can get the insert query statement for all rows in a table :-

function makeInsertQuery($mysqli, $table, $where = null) {
    $fields = $values = array();
    $sql = "SELECT * FROM `{$table}`".(is_null($where) ? "" : " WHERE ".$where).";";
    $result = $mysqli->query($sql);
    foreach ($result->fetch_fields() as $key => $value) {
        $fields[$key] = "`{$value->name}`";
    }
    while ($row = $result->fetch_row()) {
        $temp = array();
        foreach ($row as $key => $value) {
            $temp[$key] = ($value === null ? 'NULL' : "'".$mysqli->real_escape_string($value)."'");
        }
        $values[] = "(".implode(",", $temp).")";
    }
    return "INSERT INTO `{$table}` (".implode(",", $fields).") VALUES \n".implode(",\n", $values).";";
}

When you pass your connection and table name to makeInsertQuery($mysqli, $tableName), you will get the output below..

INSERT INTO `tableName` (`id`, `parent_id`, `name`, `type`) VALUES
(1103, 1019, 'ios11.2.2', 'os'),
(1104, 1019, 'ios11.2.5', 'os'),
(1105, 1017, 'iosos_version', 'os'),
(1106, 1019, 'ios11.0.1', 'os'),
(1107, 1018, 'ios10.2.1', 'os'),
(1108, 1019, 'ios11.1.1', 'os'),
(1109, 1025, 'ios9.1', 'os'),
(1110, 1019, 'ios11.0.3', 'os'),
(1111, 1019, 'ios11.2.1', 'os'),
(1112, 1019, 'ios11.1.2', 'os'),
(1113, 1018, 'ios10.3.1', 'os'),
(1114, 1018, 'ios10.3.3', 'os'),
(1115, 1019, 'ios11.2.6', 'os'),
(1116, 1019, 'ios11.2', 'os'),
(1117, 1018, 'ios10.1.1', 'os');

So using the above function you can get the insert query statement for a table in MySql.
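The same idea can be sketched in Python with the built-in sqlite3 module. MySQL's escaping rules differ from the naive quoting used here, so treat this as illustrative only:

```python
import sqlite3

def make_insert_query(con, table):
    """Build one multi-row INSERT statement for every row in `table` (illustrative quoting)."""
    cur = con.execute("SELECT * FROM {}".format(table))
    cols = ", ".join(d[0] for d in cur.description)
    rows = []
    for row in cur.fetchall():
        vals = ", ".join("NULL" if v is None else "'" + str(v).replace("'", "''") + "'"
                         for v in row)
        rows.append("({})".format(vals))
    return "INSERT INTO {} ({}) VALUES\n".format(table, cols) + ",\n".join(rows) + ";"

# Demo with an in-memory table shaped like the example output
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE os_list (id INTEGER, parent_id INTEGER, name TEXT, type TEXT)")
con.executemany("INSERT INTO os_list VALUES (?, ?, ?, ?)",
                [(1103, 1019, "ios11.2.2", "os"), (1104, 1019, "ios11.2.5", "os")])
print(make_insert_query(con, "os_list"))
```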

You can also get insert query statements by dumping or exporting the table into a .sql file. You can dump your table data by executing the command below in the terminal.

sudo mysqldump -u USERNAME -p DATABASE tableName > tableName.sql;

Windows users can export from MySQL Workbench.

If you open the .sql file in an editor, you will find insert query statements just like the ones above. Just copy all the insert query statements and execute them where you want.

Wednesday, December 19, 2018


MySQL Group By Day, Month Or Year


This is one of the problems developers face most frequently. Here I am going to explain the query to group by day, month or year in MySQL.

Using the DATE_FORMAT operator, you can easily group the timestamp, date or datetime column using any format you want.

For example, I needed to group rows that were added on the same day. Here is my query:
select count(*), DATE_FORMAT(created_at,"%Y-%m-%d") as created_day FROM widgets GROUP BY DATE_FORMAT(created_at,"%Y-%m-%d")

This query will give you a result like this:
count(*) | created_day
126 | 2012-04-12
168 | 2012-04-13
169 | 2012-04-14
189 | 2012-04-15
187 | 2012-04-16
131 | 2012-04-17

Similarly Group by month:

select count(*), DATE_FORMAT(created_at,"%Y-%m") as created_month FROM widgets GROUP BY DATE_FORMAT(created_at,"%Y-%m")

Similarly Group by year:

select count(*), DATE_FORMAT(created_at,"%Y") as created_year FROM widgets GROUP BY DATE_FORMAT(created_at,"%Y")
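The same grouping idea can be demonstrated with a runnable sketch using Python's built-in sqlite3 module. SQLite uses strftime instead of MySQL's DATE_FORMAT, but the format strings and GROUP BY pattern are the same:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE widgets (id INTEGER PRIMARY KEY, created_at TEXT)")
cur.executemany("INSERT INTO widgets (created_at) VALUES (?)",
                [("2012-04-12 09:00:00",), ("2012-04-12 17:30:00",), ("2012-04-13 08:15:00",)])

# Group rows added on the same day (strftime plays the role of DATE_FORMAT)
rows = cur.execute(
    "SELECT count(*), strftime('%Y-%m-%d', created_at) AS created_day "
    "FROM widgets GROUP BY created_day ORDER BY created_day").fetchall()
print(rows)  # [(2, '2012-04-12'), (1, '2012-04-13')]
```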


Friday, November 30, 2018


How To Setup Cron Job on AWS

A cron job is a process that runs a script automatically in the background without interrupting the user. You can execute a script automatically at any time, for example on a daily basis, by setting up a cron job.


I have already explained setting up a cron job on a GoDaddy server in one of my articles. So now here I am going to explain a step-by-step process for setting up a cron job on AWS, for Ubuntu users.
Process for Setting Up an AWS Cron:-
Step #1. Login to your AWS instance through Terminal by using your server credentials.
Step #2. Type crontab -e.
Step #3. Now editor will open inside your terminal that will look like below..


Step #4.  You have to add your script command inside it as below.

* * * * * php /var/www/html/myscript.php
Here the 5 stars mean -

1st * is for minute (0-59)
2nd * is for hour (0-23)
3rd * is for day (1-31)
4th * is for month (1-12)
5th * is for day-of-week (0-7, where both 0 and 7 mean Sun, 1 = Mon, 2 = Tue, etc.)
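The five fields plus the command can be split out mechanically; here is a toy sketch that parses a crontab line into its parts (the helper name is my own, and it does not expand ranges or steps):

```python
def parse_cron(line):
    """Split a crontab line into its five time fields and the command (a toy sketch)."""
    minute, hour, day, month, dow, *cmd = line.split()
    return {"minute": minute, "hour": hour, "day": day,
            "month": month, "day_of_week": dow, "command": " ".join(cmd)}

entry = parse_cron("30 2 * * * php /var/www/html/myscript.php")
print(entry["minute"], entry["hour"])  # 30 2 -- i.e. runs daily at 02:30
```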

After setting the cron time in minute, hour, day, month and day-of-week, we set the path of the file we want to execute automatically according to date & time. For example, 30 2 * * * php /var/www/html/myscript.php runs the script every day at 2:30 AM.

You can edit the cron by pressing the Insert key on your keyboard, and to paste your command inside the editor press ctrl+shift+v.
Now press Esc and then type :wq to save and exit the editor.
Step #5. Now check whether your cron is saved successfully by typing crontab -l in the terminal.
Step #6. Make sure you change your file permissions by executing chmod 755 /var/www/html/myscript.php in the terminal.
For testing, myscript.php can be a simple mail script like this -

<?php
$to = '[email protected]'; // recipient address
$fr_email = '[email protected]';
$subj = 'Hi this is from AWS Cron';
/* Create a simple msg body */
$body = "Welcome to myscript\n";
$body .= "\n";
// Now send email
mail($to, $subj, $body, "From: <$fr_email>");

You will get an email every time your cron runs, as per the PHP script above.

