Revitalising your Data

It's a brave new world, where data is king. Sometimes, that data lives in platforms that don't let you use it as effectively as you could. Today, I'm going to walk you through a way to programmatically extract your data and transfer it to a much richer set of analysis tools.

The Task

I was handed a project to move all of our data from Campaign Monitor into Google Data Studio. No surprise, this reporting platform has a massive list of data connectors, and after a little research I decided to go with Google Cloud Storage - it just seemed to be the most flexible option.

The Extraction

I won't be able to go into great detail on how I've built my Campaign Monitor class, since it's intellectual property, but I can point you to where the relevant documentation lives. I was asked to focus on just three streams of data: Opens, Clicks and Unsubscribes.

Be sure to follow the Getting Started guide, using the most appropriate level of authentication. Not being a web application, I've gone with the API Key + Secret approach. For each stream, you will want to work through a list of the individual campaigns; it is here that you can drill down on these three streams:
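As a rough illustration only, the sketch below shows that loop using plain cURL against the documented REST endpoints (campaigns/{id}/opens.json, clicks.json and unsubscribes.json). The API key, client ID and date are placeholder values, and in the real project this logic sits inside the Campaign Monitor class mentioned above.

<?php
// Minimal sketch: pull Opens, Clicks and Unsubscribes for every campaign.
// Replace the key, client ID and date with your own values.
$apiKey   = 'YOUR_API_KEY';
$clientId = 'YOUR_CLIENT_ID';
$since    = '2019-01-01';

function cmGet($apiKey, $path, $params = [])
{
    $url = 'https://api.createsend.com/api/v3.2/' . $path;
    if ($params) {
        $url .= '?' . http_build_query($params);
    }
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_USERPWD, $apiKey . ':x'); // basic auth: the API key is the username
    $body = curl_exec($ch);
    curl_close($ch);
    return json_decode($body, true);
}

// List the sent campaigns for the client, then drill into each stream.
$campaigns = cmGet($apiKey, "clients/{$clientId}/campaigns.json");
foreach ($campaigns as $campaign) {
    $id = $campaign['CampaignID'];
    foreach (['opens', 'clicks', 'unsubscribes'] as $stream) {
        $results = cmGet($apiKey, "campaigns/{$id}/{$stream}.json", ['date' => $since]);
        // $results['Results'] holds one row per subscriber action, ready to be flattened to CSV.
    }
}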
You'll notice that they share common parameters, in particular the date, being the point from which campaign results are returned. From those results, you can then drill down further and get the subscriber data for each recipient.

The Upload

Now things get interesting.  I'll always highly recommend using Composer in your projects.  You'll need to include the Google Cloud Storage and apiclient packages to continue.
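Assuming the standard Packagist package names, pulling them in looks something like this:

composer require google/cloud-storage
composer require google/apiclient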

Jump over to the GCP Console and create your project.  Navigate to Storage and create a bucket, which is effectively a named root location. For authentication and ease of use, get yourself a client keyfile, and then we can get the magic to start happening.  Initialize the object in this manner, and then you can upload files just as easily:
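Here is a minimal sketch of both steps using the google/cloud-storage client; the bucket name, keyfile path and file names are placeholders you'd swap for your own:

<?php
require 'vendor/autoload.php';

use Google\Cloud\Storage\StorageClient;

// Authenticate with the service-account keyfile downloaded from the GCP Console.
$storage = new StorageClient([
    'keyFilePath' => '/path/to/keyfile.json',
]);

// The bucket created earlier acts as the named root location.
$bucket = $storage->bucket('my-reporting-bucket');

// Upload a local export; the 'name' option becomes the object's path within the bucket.
$bucket->upload(
    fopen('/tmp/opens.csv', 'r'),
    ['name' => 'campaign-monitor/opens/opens.csv']
);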

Time to Report!

Now that you've got all of your files hosted, you can connect to them as a data source.  Depending on how you have stored your files in folders, you can easily flag a data source to use all files as one single model.  Keep in mind that they must all be structured in the same way.  I hope this inspires you to get your data in shape and explore new and interesting metrics you never knew you had!
