Thursday, November 28, 2019

phpMyPassion

Difference Between React and Angular - AngularJS v/s ReactJS

Many web developers struggle to choose a good option for the front end; there is often confusion over whether AngularJS or ReactJS is the smarter choice for front-end development. React and Angular are both powerful front-end technologies. Angular is designed and maintained by Google, while Facebook and its community maintain React. Both are built on JavaScript, which makes it possible to develop and customise advanced front ends for web applications.

Based on my experience, here are some points of comparison between AngularJS and ReactJS. I hope this will be helpful for front-end web developers.

Angular :

  • It is a framework.
  • Angular is best for large-scale web applications.
  • Angular uses two-way data binding and updates the real DOM directly.
  • It is difficult for a beginner to learn.
  • Angular is easy to set up, but it takes more time to deliver a project.
  • Versions below Angular 2 have backward-compatibility issues.
  • It does not have a Flux-style architecture built in; Angular uses observables for the data store.
  • Angular is slower and heavier compared to React.
  • It has solid community support.

React :

  • It is a JavaScript library.
  • React is a good option for small and medium-scale web applications.
  • React updates only the virtual DOM and works with a one-directional data flow.
  • React's learning curve is easier for beginners.
  • React takes longer to set up than Angular, but it takes less time to deliver projects; React lets you create and build apps relatively quickly.
  • All of its versions are backward compatible with previous versions.
  • React fully supports the Flux architecture and has data-store capability built in.
  • React is fast and lightweight compared to Angular.
  • React is behind Angular in community contributions and collaboration.
If you have any query, please let me know in the comments.

Monday, November 25, 2019


How To Remove Last Character From A String Using jQuery?

A common problem developers face is how to remove the last character from a string using jQuery/JavaScript. Here I am going to explain the solution with an example.


var str = '123-4';
alert(str.slice(0, -1));
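Note that slice() here is plain JavaScript rather than a jQuery method. For comparison, the same negative-index slicing idea in Python (a side note, not part of the original snippet):

```python
s = "123-4"
# slice up to, but not including, the last character
print(s[:-1])  # 123-
```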

Difference between TRUNCATE , DELETE and DROP in MySql Server

TRUNCATE
  • TRUNCATE is a DDL (Data Definition Language) command.
  • TRUNCATE is executed using a table lock; the whole table is locked while all records are removed.
  • We cannot use a WHERE clause with TRUNCATE.
  • TRUNCATE removes all rows from a table.
  • It does minimal logging in the transaction log, so it is faster performance-wise.
  • TRUNCATE TABLE removes the data by deallocating the data pages used to store the table data, and records only the page deallocations in the transaction log.
  • The identity column is reset to its seed value if the table contains one.
  • To use TRUNCATE on a table you need at least ALTER permission on the table.
  • TRUNCATE uses less transaction log space than the DELETE statement.
  • TRUNCATE cannot be used with indexed views.
DELETE
  • DELETE is a DML (Data Manipulation Language) command.
  • DELETE is executed using row locks; each row in the table is locked for deletion.
  • We can use a WHERE clause with DELETE to filter and delete specific records.
  • The DELETE command is used to remove rows from a table based on a WHERE condition.
  • It maintains the log, so it is slower than TRUNCATE.
  • The DELETE statement removes rows one at a time and records an entry in the transaction log for each deleted row.
  • DELETE retains the identity column's current value; the identity is not reset.
  • To use DELETE you need DELETE permission on the table.
  • DELETE uses more transaction log space than the TRUNCATE statement.
  • DELETE can be used with indexed views.
DROP
  • The DROP command removes a table from the database.
  • All the table's rows, indexes and privileges are also removed.
  • No DML (Data Manipulation Language) triggers will be fired.
  • The operation cannot be rolled back.
  • DROP and TRUNCATE are DDL commands, whereas DELETE is a DML command.
  • DELETE operations can be rolled back (undone), while DROP and TRUNCATE operations cannot be rolled back.
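The DELETE vs. DROP behaviour above can be tried locally with Python's built-in sqlite3 module (a sketch using SQLite, which has no TRUNCATE — a DELETE without a WHERE clause plays that role; the table and column names are made up for the demo):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
cur.executemany("INSERT INTO users (name) VALUES (?)", [("a",), ("b",), ("c",)])

# DELETE supports a WHERE clause and removes matching rows only
cur.execute("DELETE FROM users WHERE name = ?", ("a",))
print(cur.execute("SELECT COUNT(*) FROM users").fetchone()[0])  # 2

# DELETE without WHERE empties the table, but the table itself survives
cur.execute("DELETE FROM users")
print(cur.execute("SELECT COUNT(*) FROM users").fetchone()[0])  # 0

# DROP removes the table definition entirely; querying it afterwards raises an error
cur.execute("DROP TABLE users")
try:
    cur.execute("SELECT COUNT(*) FROM users")
except sqlite3.OperationalError as e:
    print("no such table" in str(e))  # True
```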

How to connect with redshift in python


import psycopg2
import psycopg2.extras


def redshift_connect():
    ####### connection and session creation ##############
    print("redshift_connect start......")
    conn = psycopg2.connect(dbname="RS_DB_NAME", host="HOST_NAME",
                            port="RS_PORT", user="USER_NAME", password="PWD")
    print('redshift connection variable', conn)
    cursor = conn.cursor(cursor_factory=psycopg2.extras.RealDictCursor)
    return conn, cursor


Set Library Path for python running configuration

******************** Set Library Path for python running configuration *****

-> ~/.bashrc

Add the lines below to set your library path:

export PATH="/home/user/anaconda3/bin:$PATH"

export PYTHONPATH="/home/user/anaconda3/bin/md"
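To confirm that a PYTHONPATH entry is actually picked up, you can launch a fresh interpreter and inspect sys.path (the /tmp/mylibs path below is just a placeholder, not the anaconda path above):

```python
import os
import subprocess
import sys

# launch a fresh interpreter with a hypothetical PYTHONPATH entry
env = dict(os.environ, PYTHONPATH="/tmp/mylibs")
out = subprocess.run(
    [sys.executable, "-c", "import sys; print('/tmp/mylibs' in sys.path)"],
    env=env, capture_output=True, text=True,
).stdout.strip()
print(out)  # True
```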

How to Convert a Python Dictionary to List


 >>> from functools import reduce
 >>> a = {'foo': 'bar', 'baz': 'quux', 'hello': 'world'}
 >>> list(reduce(lambda x, y: x + y, a.items()))
 ['foo', 'bar', 'baz', 'quux', 'hello', 'world']

Explanation
-> a.items() returns the dictionary's (key, value) pairs (a view object in Python 3).
-> Adding two tuples together makes one tuple containing all their elements. The reduction therefore creates one tuple containing all keys and values, and list(...) then makes a list from that.
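The reduce approach builds a new intermediate tuple at every step; a linear-time alternative with the same result uses itertools.chain (dicts preserve insertion order on Python 3.7+):

```python
from itertools import chain

a = {'foo': 'bar', 'baz': 'quux', 'hello': 'world'}
# flatten the (key, value) pairs into one list without intermediate tuples
print(list(chain.from_iterable(a.items())))
# ['foo', 'bar', 'baz', 'quux', 'hello', 'world']
```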








Sunday, June 2, 2019


How To Integrate DoubleClick Bid Manager API With Laravel 5.4

In this article I explain a simple process to set up the Google DoubleClick Bid Manager (DBM) API as a PHP Artisan command in Laravel 5.4. You can fetch campaign or line-item data either by setting up a cron for the created command or by running the command in the terminal.

First, edit composer.json in the project's root folder to include the Google Client:


{
    "require": {
        "google/apiclient": "^2.0"
    }
}



Next, run Composer at the shell prompt to pull the SDK into the vendor folder:

php composer.phar install --no-dev


Now we would like to use the Google DBM SDK within our app as a command. But before doing that, let us set up our client_id, client_secret and refresh_token, which are required when making requests to the Google DBM API. You can obtain the client_id and client_secret from your Google API Console.

Once we have these credentials from Google, we edit the .env file in the project's root folder and add them at the end:

DBM_CLIENT_ID=8990-ru2amnc17f5jchhgsepp0uro96smu16b.apps.googleusercontent.com
DBM_CLIENT_SECRET=XXXXXXXXXXXXX
DBM_REFRESH_TOKEN=1/VeTb6k8T291WCKWsPfnBrjOBxtwwCxO11Q6FGH8DZJk
Replace the xxx... with the values provided to you. Note that the variable names are just my own creation; you can name them whatever you'd like. We now use these variables to set up a separate config file, so that we can use Laravel's config() helper function to retrieve the values wherever we want within the app. So let's create google-dbm.php in the config folder and add the following:

return [
    'client_id' => env('DBM_CLIENT_ID', null),
    'client_secret' => env('DBM_CLIENT_SECRET', null),
    'refresh_token' => env('DBM_REFRESH_TOKEN', null),
];

Now create the command class GoogleDbmService.php with the command below:

php artisan make:console GoogleDbmService
<?php

namespace App\Console\Commands;

use Illuminate\Console\Command;

class GoogleDbmService extends Command
{
    /**
     * The name and signature of the console command.
     */
    protected $signature = 'google-dbm:service {--date=today}';

    /**
     * The console command description.
     *
     * @var string
     */
    protected $description = 'google ads api for campaign service, adset etc';

    /**
     * Create a new command instance.
     *
     * @return void
     */
    public function __construct()
    {
        parent::__construct();
    }

    /**
     * Execute the console command.
     *
     * @return mixed
     */
    public function handle()
    {
        // Your code goes here
    }
}

You have to write all your code for fetching Google campaigns and their reporting inside the handle() function.

Now, to run this command from the terminal, we have to register it in Kernel.php at app/Console/Kernel.php.

So add "Commands\GoogleDbmService::class" to the $commands array as below -


protected $commands = [
    Commands\GoogleDbmService::class,
];


Now we have everything set up to get campaign or line-item reporting from the Google DBM API.

Next, add the classes we are going to use at the top of our GoogleDbmService class:

use Google_Client;
use Google_Service_DoubleClickBidManager;
use Google_Service_DoubleClickBidManager_DownloadLineItemsRequest;
use Google_Service_DoubleClickBidManager_DownloadRequest;
use Google_Service_DoubleClickBidManager_FilterPair;
use Google_Service_DoubleClickBidManager_Parameters;
use Google_Service_DoubleClickBidManager_Query;
use Google_Service_DoubleClickBidManager_QueryMetadata;
use Google_Service_DoubleClickBidManager_QuerySchedule;
use Google_Service_Exception;

Now write your code to fetch Google DBM campaign data in the handle() function as below.

We have to get an access token first to make a Google Bid Manager API call.

To instantiate the service object you will need a valid access token:


$client = new Google_Client();
$client->setAccessType('offline');
$client->setClientId(env('DBM_CLIENT_ID'));
$client->setClientSecret(env('DBM_CLIENT_SECRET'));
$client->addScope('https://www.googleapis.com/auth/doubleclickbidmanager');
$service = new Google_Service_DoubleClickBidManager($client);
$data = $client->refreshToken(env('DBM_REFRESH_TOKEN'));
$client->setAccessToken($data['access_token']);

Download Campaigns by AdvId -

You should pass the advertiser ID (adv_id) to fetch all campaigns -

public function downloadCmp($service, $adv_ids){
    // Setup any filtering on the API request
    $dliRequest = new
    Google_Service_DoubleClickBidManager_DownloadRequest();
    $dliRequest->setFilterType("ADVERTISER_ID");
    $dliRequest->setFileTypes(array("CAMPAIGN"));
    $dliRequest->setFilterIds(array($adv_ids));
    try {
      $result = $service->sdf->download($dliRequest);
    } catch (Google_Service_Exception $e) {
      printf('<p>Exception: %s</p>', $e->getMessage());
      print '<p>Consider filtering by ADVERTISER_ID</p>';
      return;
    } catch (Exception $e) {
      printf('<p>Exception: %s</p>', $e->getMessage());
      return;
    }
    $csv_content = $result['campaigns'];
    $csvDelimiter = ','; // the downloaded SDF content is comma-delimited
    $csvLines = str_getcsv($csv_content, "\n");
    $linesToImport = [];
    foreach($csvLines as $row)
    {
      $linesToImport[] = str_getcsv($row, $csvDelimiter);
    }
    return $linesToImport;
  }

Create Query -

In DBM it is necessary to create a query to fetch line-item data. Here is the code for creating a query:

//Create Query
  public function createQuery($service, $dateRange, $filters){
    // Setup any filtering on the API request
    $qryRequest_prm = new  Google_Service_DoubleClickBidManager_Query();
    $qryRequest = new Google_Service_DoubleClickBidManager_QueryMetadata();
    $qryRequest->setDataRange($dateRange);
    $qryRequest->setFormat('csv');
    $qryRequest->setTitle('query:'.time());
    $qryRequest->setRunning('true');
    $a = [];
    foreach($filters as $k=>$v){
      $filterPairRequest = new Google_Service_DoubleClickBidManager_FilterPair();
      $filterPairRequest->setType($k);
      $filterPairRequest->setValue($filters[$k]);
      $a[] = $filterPairRequest;
    }
    $groupBY = array(
      //"FILTER_ADVERTISER",
      //"FILTER_ORDER_ID",
      "FILTER_MEDIA_PLAN",
      "FILTER_LINE_ITEM",
      //"FILTER_BUDGET_SEGMENT_DESCRIPTION",
      "FILTER_DATE"
    );
    $prmRequest = new Google_Service_DoubleClickBidManager_Parameters();
    //$prmRequest->setFilters(["FILTER_PARTNER"]);
    $params = array(
      "METRIC_CLICKS",
      "METRIC_TOTAL_CONVERSIONS",
      "METRIC_CONVERSIONS_PER_MILLE",
      "METRIC_IMPRESSIONS",
      "METRIC_CTR",
      "METRIC_REVENUE_PARTNER",
      //'METRIC_REVENUE_ADVERTISER',
    );
    $prmRequest->setMetrics($params);
    $prmRequest->setType("TYPE_GENERAL");
    $prmRequest->setFilters($a);
    $prmRequest->setGroupBys($groupBY);
    //$prmRequest->setGroupBys();
    $schRequest = new Google_Service_DoubleClickBidManager_QuerySchedule();
    $schRequest->setFrequency('ONE_TIME');
    $qryRequest_prm->setKind('doubleclickbidmanager#query');
    $qryRequest_prm->setMetadata($qryRequest);
    $qryRequest_prm->setParams($prmRequest);
    $qryRequest_prm->setQueryId(123456);
    $qryRequest_prm->setSchedule($schRequest);
    try{
      $result = $service->queries->createquery($qryRequest_prm);
    }catch (Exception $e){
      print($e);
    }
    return $result->queryId;
  }

Delete Query -

Always delete the query after you have finished all operations related to that query ID.

//Delete query after operations
  public function deleteQuery($service, $queryId){
    try{
      $result = $service->queries->deletequery($queryId);
    }catch (Exception $e){
      print($e);
    }
    return $result;
  }

Download All LineItems of an Advertiser -

If you want to download all line items of an advertiser, you can do it with the function below.

//Download Line Items
public function downloadLineItems($service, $adv_ids)
{
    $dliRequest = new Google_Service_DoubleClickBidManager_DownloadLineItemsRequest();
    $dliRequest->setFilterType("ADVERTISER_ID");
    $dliRequest->setFilterIds(array($adv_ids));
    try {
        $result = $service->lineitems->downloadlineitems($dliRequest);
        //print_r($result);
    } catch (Google_Service_Exception $e) {
        printf('<p>Exception: %s</p>', $e->getMessage());
        print '<p>Consider filtering by ADVERTISER_ID</p>';
        return;
    } catch (Exception $e) {
        printf('<p>Exception: %s</p>', $e->getMessage());
        return;
    }
    if (!isset($result['lineItems']) || count((array)$result['lineItems']) < 1) {
        print '<p>No items found</p>';
        return;
    } else {
        //print_r($result->lineItems);
        $csv_content = $result->lineItems;
        $csvDelimiter = ',';
        $csvLines = str_getcsv($csv_content, "\n");
        $linesToImport = [];
        foreach ($csvLines as $row) {
            $linesToImport[] = str_getcsv($row, $csvDelimiter);
        }
        print '<p>Download complete</p>';
    }
    return $linesToImport;
}

Get LineItems Insight -

You can change the fields and parameters according to your needs.

public function handle()
{
    if ($this->option('date') == "today") {
        $dateRange = 'CURRENT_DAY';
    } elseif ($this->option('date') == "yesterday") {
        $dateRange = 'PREVIOUS_DAY';
    } elseif ($this->option('date') == "last_month") {
        $dateRange = 'PREVIOUS_MONTH';
    } elseif ($this->option('date') == "this_month") {
        $dateRange = 'MONTH_TO_DATE';
    } elseif ($this->option('date') == "last_year") {
        $dateRange = 'PREVIOUS_YEAR';
    } elseif ($this->option('date') == "this_year") {
        $dateRange = 'YEAR_TO_DATE';
    } elseif ($this->option('date') == "all_time") {
        $dateRange = 'ALL_TIME';
    } else {
        $this->info('Please choose date param between today/yesterday/last_month/this_month/last_year/this_year/all_time');
        return;
    }
    $prefix = "DBM_";
    $partner_id = 20534561234;
    $requestParam = [];
    $requestParam['account_type'] = "DBM";
    $account = Account::getAccountListByAccountType($requestParam);
    if (count($account) <= 0) {
        $this->error('You do not have any account; please set up your account');
        return;
    }
    $this->info("Total Ad Account:-" . count($account));
    if ($account) {
        foreach ($account as $key => $value) {
            if ($value['access_token'] && $value['ext_account_id']) {
                $requestParam['user_id'] = $value['user_id'];
                $requestParam['account_id'] = $value['account_id'];
                $requestParam['account_name'] = $value['name'];
                $requestParam['ext_account_id'] = $value['ext_account_id'];
                $requestParam['adv_id'] = $value['adv_id'];
                $requestParam['manager_id'] = $value['manager_id'];
                $this->info("Account:-" . $value['ext_account_id']);
                $client = new Google_Client();
                $client->setAccessType('offline');
                $client->setClientId(env('DBM_CLIENT_ID'));
                $client->setClientSecret(env('DBM_CLIENT_SECRET'));
                $client->addScope('https://www.googleapis.com/auth/doubleclickbidmanager');
                $service = new Google_Service_DoubleClickBidManager($client);
                $data = $client->refreshToken($value['access_token']);
                $client->setAccessToken($data['access_token']);
                try {
                    $cmpByAdvId = $this->downloadCmp($service, $requestParam['ext_account_id']);
                    $this->info('Campaign count:-' . (count($cmpByAdvId) - 1));
                    if (count($cmpByAdvId) > 0) {
                        foreach ($cmpByAdvId as $key_id => $cmp_value) {
                            if ($key_id == 0) {
                                continue;
                            } else {
                                $filters = array(
                                    'FILTER_PARTNER' => $partner_id,
                                    'FILTER_ADVERTISER' => $requestParam['ext_account_id'],
                                    'FILTER_MEDIA_PLAN' => $cmp_value[0],
                                );
                                $queryId = $this->createQuery($service, $dateRange, $filters);
                                if ($queryId != '' && $queryId != 0) {
                                    sleep(60); // because the csv takes time to be written to Google Cloud Storage
                                    // Call the API, getting the query we just created.
                                    $result = $service->queries->getquery($queryId);
                                    $csv_content = file_get_contents($result->getMetadata()->getGoogleCloudStoragePathForLatestReport());
                                    if (isset($csv_content)) {
                                        $csvDelimiter = ',';
                                        $csvLines = str_getcsv($csv_content, "\n");
                                        $linesToImport = [];
                                        foreach ($csvLines as $row) {
                                            $linesToImport[] = str_getcsv($row, $csvDelimiter);
                                        }
                                        $this->info('Campaign Id:-' . $cmp_value[0]);
                                        $i = 0;
                                        if (count($linesToImport) > 0) {
                                            foreach ($linesToImport as $key => $value) {
                                                if ($key == 0) {
                                                    continue;
                                                } elseif ($key > 0 && count($linesToImport[$key]) < 5) {
                                                    continue;
                                                } else {
                                                    if (!empty($value[10])) {
                                                        if (!empty($cmp_value[10])) {
                                                            $sdate = strtotime($cmp_value[10]);
                                                            $cmp_sdate = date('Y-m-d h:i:s', $sdate);
                                                        } else {
                                                            $cmp_sdate = '';
                                                        }
                                                        if (!empty($cmp_value[11])) {
                                                            $edate = strtotime($cmp_value[11]);
                                                            $cmp_edate = date('Y-m-d h:i:s', $edate);
                                                        } else {
                                                            $cmp_edate = '';
                                                        }
                                                        $requestParam['cmp_id'] = $cmp_value[0];
                                                        $requestParam['flight_id'] = $value[3]; //adsGroup id
                                                        $requestParam['conversions'] = $value[10];
                                                        $requestParam['impressions'] = $value[12];
                                                        $requestParam['clicks'] = $value[9];
                                                        $requestParam['ctr'] = $value[13];
                                                        $requestParam['date'] = $value[7];
                                                        $requestParam['wkno'] = (int)date("W", strtotime($value[7]));
                                                        $requestParam['rev_model'] = NULL;
                                                        $requestParam['cmp_name'] = $prefix . $cmp_value[2];
                                                        $requestParam['cmp_status'] = $value[2];
                                                        $requestParam['campaign_name'] = $cmp_value[2];
                                                        $requestParam['cmp_type'] = NULL;
                                                        $requestParam['flight_name'] = $prefix . $value[2];
                                                        $requestParam['start_date'] = $cmp_sdate;
                                                        $requestParam['end_date'] = $cmp_edate;
                                                        $requestParam['budget'] = $cmp_value[9];
                                                        $requestParam['payout'] = $value[14];
                                                        $requestParam['hour'] = 0;
                                                        $requestParam['ms_id'] = '178'; //DBM media source id
                                                        $saveCmpRptData = DbmOverviewTableForTest::saveYourCmpRptData($requestParam);
                                                        if ($saveCmpRptData) {
                                                            if (isset($saveCmpRptData['status']) && $saveCmpRptData['status'] == 9) {
                                                                DbmOverviewTableForTest::updateYourCmpRptData($requestParam);
                                                            }
                                                        } else {
                                                            $this->info('database not connected and something went wrong');
                                                            return;
                                                        }
                                                        $i++;
                                                    }
                                                }
                                            }
                                            $this->info('Count Report:-' . $i);
                                            $this->info('*************************');
                                            $this->deleteQuery($service, $queryId);
                                        }
                                    } else {
                                        $this->info('@Error:csv_content variable is not found');
                                        $this->deleteQuery($service, $queryId);
                                    }
                                }
                            }
                        }
                    }
                } catch (\Exception $e) {
                    print($e);
                    return;
                }
            }
        }
    }
}


Run your command in the terminal -

php artisan google-dbm:service --date=today

You can see your output in the terminal.

If you face any difficulty setting up the Google DBM API with Laravel, let me know in the comments.

Tuesday, May 21, 2019


Process To Test AWS Route53 ChangeResourceRecordSets API from Terminal

Here I explain the process to test the AWS Route53 ChangeResourceRecordSets API from the terminal. Copy the code below, paste it into your terminal, and press ENTER.



aws route53 change-resource-record-sets --hosted-zone-id ZGN2AVYTFOA --change-batch '
{
 "Comment": "",
 "Changes": [
   {
     "Action": "CREATE",
     "ResourceRecordSet": {
       "Name": "anilc.phpmypassion.com.",
       "Type": "A",
       "AliasTarget": {
          "HostedZoneId": "Z26RNL4JYFTOTI",
          "DNSName": "phpmypassion-event-0ddd4472d016751e.elb.us-east-1.amazonaws.com",
          "EvaluateTargetHealth": false
       }
     }
   }
 ]
}'
 
"HostedZoneId": "Z26RNL4JYFTOTI" :- This hosted zone ID belongs to the load balancer.

--hosted-zone-id ZGN2AVYTFOA :- This hosted zone ID belongs to the Route53 hosted zone.
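If you would rather script this call than paste JSON into the terminal, the same change batch can be built as a Python dict (for example to pass as ChangeBatch to boto3's Route53 client; shown here only as data construction with the sample values from above, not a live AWS call):

```python
import json

# values copied from the CLI example above
change_batch = {
    "Comment": "",
    "Changes": [
        {
            "Action": "CREATE",
            "ResourceRecordSet": {
                "Name": "anilc.phpmypassion.com.",
                "Type": "A",
                "AliasTarget": {
                    "HostedZoneId": "Z26RNL4JYFTOTI",
                    "DNSName": "phpmypassion-event-0ddd4472d016751e.elb.us-east-1.amazonaws.com",
                    "EvaluateTargetHealth": False,
                },
            },
        }
    ],
}

# the dict round-trips through JSON cleanly, so the payload handed to the CLI is well-formed
print(json.loads(json.dumps(change_batch))["Changes"][0]["Action"])  # CREATE
```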


Output:-


{
    "ChangeInfo": {
        "Id": "/change/CWR0FJ4AJNZFT",
        "Status": "PENDING",
        "SubmittedAt": "2019-05-21T10:06:47.123Z",
        "Comment": ""
    }
}


Tuesday, March 19, 2019


Process to Create AWS Lambda Function in Python

In this article I share the full, easy process of creating an AWS Lambda function using Python. Follow the step-by-step process below for creating an AWS Lambda using Python 3.

********************** process to create lambda function *****************


-> fill in basic information like name, runtime environment and permissions



*******************Now add Trigger from the list in left side****************


In my case I used s3

-> configure trigger basics
-> select the s3 bucket [the bucket on which you want to run your trigger]
-> event type [like bucket PUT]
-> prefix
-> suffix [like .csv, .jpg]



Note:- you can either choose prefix or suffix.

*********************** upload your lambda function code ****************


In my case I had to produce the CSV file name as a message; the file was uploaded to an S3 bucket automatically by a Python program (an Athena query result). So I produced the message with a Kafka producer on the S3 PUT event trigger.

For setting up the Kafka producer on the Lambda trigger, I followed the steps below.

#Install kafka-python on your local machine and create a .zip of the whole package.

Follow below steps:-

"""
# https://pypi.org/project/kafka-python/

# pip install kafka-python -t <FOLDER PATH>
# pip install requests -t <FOLDER PATH>

# README:
* Please install the Kafka Python & Requests libs in the same folder for it to work
    in AWS Lambda.
* When uploading to AWS Lambda, zip the entire folder.
   If you are uploading a zip file, make sure that you are zipping the contents
   of the directory and not the directory itself.
   Otherwise you will get the error: aws lambda unable to import module

zip -r ../build_dmp_dashboard_from_athena.zip *

"""

#Now you have to create a file that produces the message on the trigger, as below -

Let's first create a file named lambda_producer.py.

from __future__ import print_function
import sys, time
import json
import requests
from kafka import KafkaProducer

########################### Configurations #############################################
email_api = 'https://api.phpmypassion.com/api/send-mail'
email_to = '[email protected]'

kafka_server_cluster = ['cluster1', 'cluster2', 'cluster3'] 
topic = "athena_query_result"

"""
Send Email
"""
def sendmail(message):
  msg = ("{}:{}".format("lambda_producer", message))
  data = {
    'toEmail': email_to,
    'mailSubject': msg,
    'mailBody': msg
  }

  return requests.post(email_api, data)


"""
Producer Connection
"""
exception_counter = 0
for x in range(3):
  try:
    producer = KafkaProducer(bootstrap_servers=kafka_server_cluster)
    break

  except Exception as e:
    exception_counter = exception_counter + 1
    print(e)
    print("Error in Connection to kafka cluster: {0}".format(kafka_server_cluster))
    time.sleep(2)  # 2 Sec

if (exception_counter >= 2):
  message = ('Error: Looks like connection to kafka cluster failed on lambda producer: {0}'.format(kafka_server_cluster))
  print(message)
  response = sendmail(message)
  print("Email Sent Response: {0}".format(response))
  print("Program exiting")
  sys.exit(2)

"""
Lambda handler. Receives the trigger event from AWS S3 and produces
a message on the Kafka queue.
"""
def lambda_handler(event, context):
  print("Received event : " + json.dumps(event, indent=2))

  # Parse the S3 Trigger and Get the CSV Path & File from the "event"
  key_file_name = (event['Records'][0]['s3']['object']['key']) 

  print("We have a new file. file name & path : ", key_file_name)
  try:
    producer_ret_val = producer.send(topic, key_file_name.encode())  # If Topic Does not Exist then it get created automatically by python Producer
    record_metadata = producer_ret_val.get(timeout=10)
    print("sent event to Kafka! topic {} partition {} offset {}".format(record_metadata.topic, record_metadata.partition, record_metadata.offset))

  except Exception as e:
    print(e)
    print("Some Error in lambda_producer")
    raise e
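The key lookup in lambda_handler can be exercised offline with a minimal mock event (the file name below is hypothetical):

```python
# minimal shape of an S3 PUT event, trimmed to the field lambda_handler reads
event = {"Records": [{"s3": {"object": {"key": "athena/results/query1.csv"}}}]}

key_file_name = event["Records"][0]["s3"]["object"]["key"]
print(key_file_name)  # athena/results/query1.csv
```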



#Now let's set this up within AWS Lambda:-

-> choose "code entry type" and select the option to upload .zip


-> upload your zip file and click to save

-> your lambda code has been uploaded.

-> in my case I uploaded kafka configuration zip folder

#You also have to change the Lambda handler setting: since the lambda_handler function lives in the lambda_producer.py file, fill in "lambda_producer.lambda_handler" as the handler value.

#Now click on save button.

That completes all the steps.



Sunday, March 17, 2019


How to get data from redshift table as dictionary in python

Dict cursors allow access to the retrieved records through an interface similar to Python dictionaries instead of tuples.

Add the following argument to your cursor call -

cursor_factory=psycopg2.extras.RealDictCursor

After adding the above code, your cursor should look like this -

cur = con.cursor(cursor_factory=psycopg2.extras.RealDictCursor)

If you get an error saying the extras module is not found -
AttributeError: module 'psycopg2' has no attribute 'extras'

Resolve this error by importing the module explicitly, as below:

import psycopg2
import psycopg2.extras

Now if you run this code, you will get the data from Redshift as a dictionary of key-value pairs, like below:-

{'id': 1234567, 'date': datetime.date(2019, 2, 13), 'hour': 0, 'code': 42107, 'count': 6, 'revenue': None, 'payout': None, 'week': 7}
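RealDictCursor is specific to psycopg2, but the underlying idea — rows addressable by column name — can be tried locally with sqlite3's row factory (an analogy only, not Redshift; the table and values mimic part of the sample output above):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.row_factory = sqlite3.Row  # rows become addressable by column name
cur = conn.cursor()
cur.execute("CREATE TABLE stats (id INTEGER, hour INTEGER, count INTEGER)")
cur.execute("INSERT INTO stats VALUES (1234567, 0, 6)")

row = cur.execute("SELECT * FROM stats").fetchone()
print(dict(row))  # {'id': 1234567, 'hour': 0, 'count': 6}
```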











Saturday, March 16, 2019


Pandas- ImportError: Missing required dependencies ['numpy']

This is a common error that sometimes occurs when importing the pandas library. Here I am sharing the solution: follow the steps described below, and if you face any difficulty, let me know in the comments.

Problem:-

>>> import pandas
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/anil/anaconda3/lib/python3.6/site-packages/pandas/__init__.py", line 19, in <module>
    "Missing required dependencies {0}".format(missing_dependencies))
ImportError: Missing required dependencies ['numpy']


Solution:-

Check the folder list in your Python lib folder:

-> ls -la

Now, if you see a numpy folder there, run the command below:-

sudo pip3 uninstall numpy

Now check the folder list again with -

-> ls -la

Try importing pandas -

[email protected]:~/.local/lib/python3.6/site-packages$ python3
Python 3.6.4 |Anaconda, Inc.| (default, Jan 16 2019, 18:10:19)
>>>
>>> import pandas
>>>
>>>
>>>

You should not see this error now.



Process to Change MySql ip-Address to Listen for all IP's

*************** process to change MySql ipaddress to listen for all ip ********

-> update user set Host="%" where User="phpmyadmin";

-> sudo vi /etc/mysql/mysql.conf.d/mysqld.cnf [change "bind-address = 0.0.0.0"]

-> FLUSH PRIVILEGES;

-> check with netstat -nlp | grep 3306 [output should show MySQL listening on 0.0.0.0]

-> telnet 192.168.1.19 3306 [now check whether it connects using that IP]
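The telnet check can also be scripted. A small Python helper along the same lines (the host and port are the example values from above; the result depends on your network):

```python
import socket

def can_connect(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# e.g. check whether MySQL is reachable on the example host
print(can_connect("192.168.1.19", 3306))
```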

Friday, March 15, 2019


Safest Way to Merge A Git Branch Into Master

In this article I am going to explain the safest way to merge a git branch (a stage branch) into the master branch.

Git Merge:-

Never forget to pull from the master branch after pushing all your new changes to your current git branch.

If your current branch (the stage branch) is up to date and you now want to merge it into the master branch, follow the process below.

First check out the master branch, then merge with the commands below:

-> git checkout master
-> git pull origin <current-branch-name> [git pull origin master]
-> git merge <previous-branch-name> [git merge stage]
-> git push origin <current-branch-name> [git push origin master]
-> git checkout <previous-branch-name> [git checkout stage]

That completes all the steps.

Top Mostly Used Git Commands

************************** Git clone a Repository ******************

git clone <repository_url>

************************ Check git branch *********************

git branch

************************ Git Create Branch *************

git checkout -b <branch-name>

************************ Checkout to another branch *************

git checkout <branch-name>

************************ Make your file same as git repository *************

git checkout <file-name>

-> for all files :-

git checkout .

************************* Git commit command *********************

git add . [for all files]

git commit -m "your comment"


************************* Git push command **********************

git push origin <your-branch-name>


*********************** Git pull command ************************

git pull origin <your-branch-name>

*********************** Git delete branch command ***************

git branch -d <your-branch-name>

************************** Git check current repository URL command ***********

-> git remote -v

-> git remote show origin

**************************** Git change repository remote URL command ************
$ git remote set-url origin [email protected]:USERNAME/REPOSITORY.git

Thursday, March 14, 2019


List of Top Docker Commands in Linux

Docker registries work much like Git remotes.

You should always pull the latest image before working.

****************************** Docker install ******************

sudo apt-get update
sudo apt-get install docker.io

**************************** Check docker installed or not *************

docker ps -a

************************* Login to AWS container via command line ************

 $(aws ecr get-login --no-include-email --region us-east-1)
get-login
[--registry-ids <value> [<value>...]]
[--include-email | --no-include-email]

**************************** pull the docker image *************

docker pull your-server.us-east-1.amazonaws.com/aws_repository

************************** Docker push image **************************************

-> commit your changes first

docker commit <your-container-name> your-server.us-east-1.amazonaws.com/aws_repository:v2.2 (v2.2 is the tag version)

-> Now run push command

docker push your-server.us-east-1.amazonaws.com/aws_repository:v2.2

you can check your latest push on server [aws/ECR/Repository/click on your repository]

*********************************** Create container with in docker *********************

docker run -it --restart always --net=subnet15 --hostname=<your-container-name> --name <your-container-name> your-server.us-east-1.amazonaws.com/aws_repository

**************************** Enter into docker container *************

-> docker exec -it <docker-container-name> bash

-> docker attach <docker-container-name>

****************************** To check docker container *******
-> docker ps

-> su md (switch to md user)

*************************** Check all cron in docker *****
crontab -e

******************** Install library into lib folder ***********************
pip3 install name -t . (within your python library path)

*************************** List Docker CLI commands **************
docker
docker container --help

*************** Display Docker version and info *************
docker --version
docker version
docker info

************ Execute Docker image **********
docker run hello-world

******************* List Docker images ***************
docker image ls

******************* List Docker containers (running, all, all in quiet mode) *****************
docker container ls
docker container ls --all
docker container ls -aq

************************* list docker all container*********************
docker container ls -a