Create IBM Cloud Service Credentials with the IBM Cloud CLI

To access an IBM Cloud service instance, you need service credentials. These credentials can be generated via the IBM Cloud Dashboard or the IBM Cloud CLI. My preferred approach is always to use the CLI, but it can be confusing the first few times you attempt it. Here’s a quick cheat sheet:

1. Create the service instance (read more about service instance creation via the CLI). Here is an example of creating a Watson Tone Analyzer instance named myta:

ibmcloud resource service-instance-create myta tone-analyzer lite eu-de

2. Create a service key associated with that service instance. Make sure that you assign an appropriate role to the key depending on the privileges required (for example, Manager or Writer for write access). Here is an example of creating a service key named myta-key:

ibmcloud resource service-key-create myta-key Manager --instance-name myta

The response to the above request contains an apikey field with an API key.

3. Obtain an OAuth access token using the apikey returned at the end of the previous step. Here is an example of making a request for an OAuth access token:

curl -X POST 'https://iam.cloud.ibm.com/identity/token' -H 'Content-Type: application/x-www-form-urlencoded' -d 'grant_type=urn:ibm:params:oauth:grant-type:apikey&apikey=YOUR-APIKEY-HERE'

The response to the above request contains an access_token field holding a time-limited OAuth access token.
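For reference, the token exchange in step 3 can be sketched in a few lines of Python. The IAM endpoint below is the one IBM documents for this OAuth flow (verify it against the current docs), and the sample response is hypothetical:

```python
import json
import urllib.parse

# Assumption: the public IAM token endpoint documented by IBM.
IAM_TOKEN_URL = "https://iam.cloud.ibm.com/identity/token"

def build_token_request(apikey: str) -> bytes:
    """Form-encode the token request body, mirroring the curl example above."""
    return urllib.parse.urlencode({
        "grant_type": "urn:ibm:params:oauth:grant-type:apikey",
        "apikey": apikey,
    }).encode()

def extract_token(response_body: str) -> str:
    """Pull the time-limited access_token field out of the JSON response."""
    return json.loads(response_body)["access_token"]

# Hypothetical response, trimmed for illustration.
sample = '{"access_token": "HYPOTHETICAL-TOKEN", "token_type": "Bearer", "expires_in": 3600}'
print(extract_token(sample))
```

The body built by `build_token_request()` is what the `-d` flag passes in the curl example; POST it to the token endpoint with a `Content-Type: application/x-www-form-urlencoded` header.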

4. You can now access and work with the IBM Cloud service instance by attaching the OAuth token to your requests as a credential. The actual request parameters and headers will differ from service to service – refer to the service documentation for specific details. For example, here is how you might send an authenticated request to the Watson Tone Analyzer service instance created above:

curl -X GET 'YOUR-SERVICE-ENDPOINT-HERE' -H "Authorization: Bearer YOUR-TOKEN-HERE"

The screenshot below shows an example session:

Create IBM Cloud Services with the IBM Cloud CLI

Most of the time, I use the IBM Cloud Dashboard with my browser to manage applications, instantiate services and check status. But sometimes, the point-and-click approach is just too slow. So, when I feel the need for speed, I switch over to the IBM Cloud CLI instead. This CLI, which is available for Windows, macOS and Linux, provides a full-featured alternative to the browser-based IBM Cloud Dashboard.

The IBM Cloud CLI is particularly convenient for quick service instantiation. In the middle of writing code and need to launch an object storage service or a message queue for quick testing? The IBM Cloud CLI gets it done with a single command: ibmcloud resource service-instance-create.

Here’s an example of launching a Cloudant database service named mycloudant with only IAM credentials:

ibmcloud resource service-instance-create mycloudant cloudant lite eu-de -p '{"legacyCredentials": false}'

Here’s another example of launching a tone analyzer service named myta with both IAM and legacy credentials:

ibmcloud resource service-instance-create myta tone-analyzer lite eu-de -p '{"legacyCredentials": true}'

The command typically accepts five parameters (although some services may require more): the instance name, the IBM Cloud catalog service identifier, the service plan, the region code, and whether to support IAM credentials only or both IAM and legacy credentials. Once instantiated, the created services also become visible in the IBM Cloud dashboard:

In the examples above, cloudant and tone-analyzer are unique service identifiers in the IBM Cloud catalog, while mycloudant and myta are the names of the instantiated services. You can obtain the complete list of available services in the catalog and their identifiers with the ibmcloud catalog service-marketplace command.

To delete an instantiated service, use the ibmcloud resource service-instance-delete command and pass it the instance name. Here’s an example of deleting the tone analyzer service created earlier:

ibmcloud resource service-instance-delete myta

The screenshot below shows an IBM Cloud CLI session using these commands:

Go Serverless with PHP

If you’re building a small, single-purpose Web application and you want to run it in the cloud, the typical option is to reach for a small cloud server from Amazon, Google or Azure. Those servers are relatively cheap to run and easy to deploy, but there’s an ongoing investment of time and effort required to keep them secure and updated.

In many cases, an even easier solution is to run your application using a Functions-as-a-Service (FaaS) platform. This is essentially a serverless deployment which hosts your code as a standalone function (or series of interconnected functions) in the cloud and executes it on demand in response to incoming requests, or on a predefined schedule. The key advantage here is that there is almost no administrative overhead related to server hosting or maintenance, plus you can easily scale the deployment up (or down) as needed.
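To make the model concrete: an OpenWhisk-style action is just a single entry-point function that receives the request parameters and returns a result. Here's a minimal sketch in Python (the article's own code is PHP, but the shape of an action is the same):

```python
# Minimal sketch of a FaaS handler in the Apache OpenWhisk convention:
# one entry-point function that takes the request parameters as a dict
# and returns a dict. The platform runs it on demand, so there is no
# server process for you to maintain.
def main(params: dict) -> dict:
    name = params.get("name", "world")
    return {"greeting": "Hello, " + name + "!"}

# Simulate an invocation locally with some test parameters:
print(main({"name": "cloud"})["greeting"])
```

On the platform itself, the dictionary returned by `main` becomes the JSON response sent back to the caller.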

I recently had an opportunity to try IBM Cloud Functions, a FaaS platform built on Apache OpenWhisk, and was able to develop a number of purpose-built PHP applications on it. Although it took a few experiments and some time spent reading the documentation, I found that I was able to do almost everything you’d expect to do in a standard hosted PHP environment, including using external PHP libraries, integrating with third-party APIs and external databases, scheduling tasks and accessing detailed debug logs.

If this sounds like something you’d like to try for your next PHP project, take a look at my tutorial on IBM Developer, in which I discuss some common use cases (together with example code). To demonstrate real-world utility, I also walk you through building a full-fledged PHP application that integrates with Slack and OpenWeatherMap to send you automatic weather notifications for your city in your Slack workspace.

Monitor and Correlate Personal Asthma Readings with Environmental Data

For people suffering from asthma, a peak flow meter makes it easy to monitor lung capacity and understand how changes in weather and season affect their condition. Typically, this information is stored as discrete readings in an “asthma diary”.

The problem with a paper diary, though, is that as the number of readings increases, so does the difficulty of analyzing them in a holistic way. To make this easier, I decided to build a simple cloud application to both record meter readings electronically and provide visualization tools to make analysis as simple as a one-button click.

My “cloud asthma diary” allows users to input and save peak flow meter readings into an online application. Each time a reading is added, the application automatically retrieves current external weather conditions, such as temperature and humidity, and enhances the reading with this additional metadata. Users can see visualizations of the saved data, such as a bar chart of peak flow averages grouped by month, or a scatter plot of peak flow values versus temperature.
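To illustrate the enrichment step, here's a hypothetical Python sketch of what a saved diary entry might look like. The field names here are illustrative assumptions; the application's actual schema (and its PHP implementation) may differ:

```python
from datetime import date

# Hypothetical sketch: enrich a peak flow reading with weather metadata
# before saving it. Field names are illustrative, not the app's real schema.
def enrich_reading(peak_flow: int, weather: dict) -> dict:
    return {
        "date": date.today().isoformat(),
        "peak_flow": peak_flow,                     # litres/minute from the meter
        "temperature": weather.get("temperature"),  # current external temperature
        "humidity": weather.get("humidity"),        # current external humidity
    }

entry = enrich_reading(430, {"temperature": 31, "humidity": 60})
print(entry["peak_flow"], entry["humidity"])
```

Storing the weather fields alongside each reading is what makes the correlation charts (peak flow versus temperature or humidity) possible later, without any further API calls.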

Here are examples of graphs generated by the application. The first shows changes in peak flow readings by month, while the second correlates readings with external humidity.

The application uses the IBM Cloud Cloudant service to save readings to a Cloudant database in the cloud and the IBM Cloud Weather Company service to retrieve current weather data. It also uses the Slim PHP micro-framework to process requests, Bootstrap and Twig to create a mobile-optimized user experience, Guzzle for API operations, and Chart.js to create charts and graphs from the saved data.

If this sounds like something that would help you better understand and manage your own asthma, you can download the code from GitHub and then read my IBM Developer article for details on how to deploy and use it. And if you’d like to customize it – for example, if you’d like to add extra metadata or create additional types of charts – ping me on GitHub or submit a pull request with your changes.

Migrate from CCTM to Pods in WordPress

When I first started with this WordPress blog, I installed the Custom Content Type Manager (CCTM) plugin to help me manage custom content types. Although this has been perfectly adequate for my needs, I’ve been looking to migrate over to the Pods plugin for a while now, if only because it seems to offer a bunch of additional features (relationships, a REST API…) and a nicer interface. And recently, when CCTM appeared to have crashed on me for a couple of days (turned out to be a stale cache), I decided to take the plunge.

My blog uses three custom content types: Books, Articles and Presentations. Pods’ documentation is extensive, but it didn’t cover my key problem: transferring control of these existing content types, hitherto managed by CCTM, over to Pods without either corrupting my data or having to spend time exporting and re-importing my posts.

After a few false starts (and database restores), I finally hit upon a simple process to transfer control of my custom content types from CCTM to Pods. At the end of the process, I was able to remove the CCTM plugin and view and manage all my custom content through the Pods interface.

In case you ever need to do this, here are the steps I performed. The steps here reference a Book content type, and you’ll need to repeat the process for each of your custom types:

  1. Back up the WordPress database.
  2. Install and activate the Pods plugin.
  3. Log in to your MySQL database using the CLI and run the following SQL query to rename posts of type book to a (temporary) new identifier. This step is necessary to avoid a name collision with Pods.
    UPDATE wp_posts SET post_type='unspec' WHERE post_type='book';

  4. From the CCTM plugin administration page, make a list of the fields attached to the Book content type and then deactivate it. The process should affect 0 posts (because the database no longer has any records with the book identifier).
  5. From the Pods plugin administration page, clear the Pods cache.
  6. Create a new Book pod.
  7. Add fields to the new Book pod, ensuring that the field names are identical to those originally attached to the Book content type in CCTM. It’s important to ensure the field names match exactly, as otherwise you might lose some of your custom metadata.
  8. Save the new Book pod.
  9. Log back in to your MySQL database using the CLI and run the following SQL query to change your posts back to type book.
    UPDATE wp_posts SET post_type='book' WHERE post_type='unspec';

  10. Check the list of Books in the Pods administration interface. You should see that the Pods plugin now recognizes all your Books, and you can use the Pods administration interface to add, modify and delete Book posts.
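The two UPDATE queries in steps 3 and 9 form a round-trip that should leave every Book post intact. If you want to convince yourself of that before touching production data, here's a small simulation using SQLite in place of MySQL (the table and column names mirror the queries above; a real WordPress database has many more columns):

```python
import sqlite3

# Simulate the rename round-trip from steps 3 and 9 on a throwaway table.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE wp_posts (id INTEGER PRIMARY KEY, post_type TEXT)")
db.executemany("INSERT INTO wp_posts (post_type) VALUES (?)",
               [("book",), ("book",), ("post",)])

# Step 3: move Book posts to a temporary identifier.
db.execute("UPDATE wp_posts SET post_type='unspec' WHERE post_type='book'")

# Step 9: move them back after the Book pod has been created.
db.execute("UPDATE wp_posts SET post_type='book' WHERE post_type='unspec'")

# Both Book posts survive the round-trip; the unrelated post is untouched.
print(db.execute("SELECT COUNT(*) FROM wp_posts WHERE post_type='book'").fetchone()[0])
```

Nothing is deleted at any point; the posts are only relabelled, which is why the CCTM deactivation in step 4 reports 0 affected posts.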

Once you have transferred control of all your CCTM content types to Pods, you can simply deactivate and uninstall the CCTM plugin. All your previous custom content can now be managed through the Pods interface.

Track Time Spent on Projects with IBM Cloud

Many, many years ago, I wrote an article about building a timesheet system to track and analyze work hours with PHP 4.x. It was a fairly useful little application, one that I ended up using quite a bit in my daily work as a consultant. As my requirements increased, I moved on to other project management systems…but I always had a soft spot for this, the one I used with my first few clients.

Fast-forward to 2018 and it occurred to me that it might be fun to go back and visit the Boss, the Customer and the Lazy Programmer and try to rebuild my original application, but with a modern spin: using a micro-framework with the latest version of PHP, scalable cloud-based data storage and PaaS infrastructure.

The application was fun to build and the result is a modern, lightweight and mobile-optimized evolution of my 2001 design. Essentially, the tool allows busy professionals (like lawyers and accountants) to define one or more projects, each representing a customer job, and to enter, on an ongoing basis, the hours worked on each project. Data is stored online, and users can view or export a report of hours worked per project at any time.

Here is a screenshot of the data entry screen:

And here’s another of the reporting screen:

As you can see, the entire application is mobile friendly, enabling users to enter data and view reports even when on the move (perfect for professionals who don’t have a fixed office or place of work). Behind the scenes, the application uses MySQL for data storage, Bootstrap for the mobile-optimized interface, the Slim PHP micro-framework for request processing, and the IBM Cloud CloudFoundry platform to deploy and scale.

If you’re looking for a lightweight time tracking tool for your projects, fork the code on GitHub and use it for your own business. If you’re interested in finding out how to deploy or customize the application, read my IBM Developer article for the full details.

Test and Deploy PHP Applications Automatically on IBM Bluemix

Continuous delivery is pretty awesome. There’s no fiddling about with manual packaging or code transfers; instead, your code is automatically packaged and deployed on an ongoing basis, and you can see (and test) the results within a few minutes of making a change.

Up until recently, if you were using Bluemix, you needed to rely on external or third-party services coupled with various custom scripts to implement continuous delivery for your Bluemix applications. To make things easier, a Bluemix Continuous Delivery service was recently introduced, which comes tightly integrated with Bluemix out of the box and which provides a secure, automated way to build, test, and deploy your Bluemix applications.

I recently had an opportunity to try the new Bluemix Continuous Delivery service with a PHP application on GitHub. In my usage scenario, I wanted my target Bluemix deployment to always reflect the latest “tests passed” version of the PHP application’s dev-master branch.

The basic process I settled on was:

  • Each time a pull request is created in the GitHub source repository, the source code will be automatically tested by Travis CI with PHPUnit to ensure all unit tests pass.
  • If the unit tests pass, the pull request will be manually or automatically merged into the repository’s development branch. This merge will automatically trigger a new deployment of the application on Bluemix using a Bluemix Continuous Delivery toolchain.
  • If the unit tests fail, the pull request will not be merged and the current deployment on Bluemix will remain unaffected.
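The Travis CI half of this process is driven by a .travis.yml file in the source repository. The fragment below is a rough sketch, not the article's actual configuration; the PHP version and the phpunit path are assumptions you'd adjust for your own project:

```yaml
# Sketch of a Travis CI config that runs PHPUnit on every pull request.
language: php
php:
  - '7.1'
install:
  - composer install
script:
  - vendor/bin/phpunit
```

With this in place, GitHub shows the test result on each pull request, and only merges into the development branch (which trigger the Bluemix toolchain) ever reach the deployment.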

Here’s what the process looks like:

Sounds interesting? To find out how to do the same for your Bluemix applications, read my IBM developerWorks article which has a complete walkthrough of the steps.

PHP and Bluemix at the IBM Bluemix Meetup Dubai

I’ll be speaking at the IBM Bluemix Dubai user group meeting next week.

  • Using PHP with IBM Bluemix (Mon Aug 28 2017 at 19:00)
    This session covers the basics of developing and deploying PHP applications on IBM Bluemix. It will be based heavily on my experiences working with different IBM Bluemix services, which I’ve also described in various articles. It will include a hands-on coding demonstration that you can follow along with on your laptop.

If you’re in Dubai next week, please drop by, grab a cup of coffee and say hello!

Update: Slides are now available, as are photos of the event.

PHP and Bluemix at the IBM Cloud Meetup Mumbai

I’ll be speaking at the IBM Cloud user group meeting in two days.

  • Using PHP with IBM Bluemix (Sat Jul 29 2017 at 10:00)
    This session covers the basics of developing and deploying PHP applications on IBM Bluemix. It will be based heavily on my experiences working with different IBM Bluemix services, which I’ve also described in various articles. It will include a hands-on coding demonstration that you can follow along with on your laptop.

If you’re in Mumbai over the weekend, please drop by, grab a cup of coffee and say hello!

Update: Slides are now available.

Build a Searchable CV Database with IBM Bluemix and PHP

If you work as an independent recruiter or an in-house HR manager, you’re probably inundated with CVs from aspiring job candidates. You’re also probably collecting data from other sources: social media, LinkedIn profiles and your own interview notes. And the more data you collect, the harder it becomes to spot the right candidate for a particular requirement.

If that sounds familiar to you, relax. There are better ways of matching job candidates with business needs, especially if you let technology help you. Close your eyes and imagine an application that lets you upload all those CVs and notes, scans and indexes them, and gives you a way to search them using skill keywords like “PHP” or “aerospace engineer”.

Sounds good? I thought so too. That’s why I recently built a mobile-optimized app for CV indexing and search, which does everything I just described.

It’s built using PHP and tightly integrated with both Searchly (for fast data indexing and search) and Bluemix Object Storage (for CV upload and download). Try out the demo on Bluemix, get the code from GitHub and read all about it in my developerWorks article.