Last week, we held a webinar where we explained how the event-driven processing system integrated into our object store works. The goal of this presentation was to introduce the technology and explain how it works under the hood, and then to run a live demo using one of the most popular open source machine learning libraries: TensorFlow.
If you are familiar with OpenIO and this blog, you may already be aware of the many tasks you can perform with our event-driven storage system, such as in the article where I explain how to recognize faces in pictures within our object storage solution.
In the webinar, we built a more complex workflow. Any time a picture is uploaded using the S3/Swift/OpenIO APIs, we process it with TensorFlow, detecting its content and enriching the object’s metadata, and then push it to Elasticsearch, so that we can search across all the objects at any time.
Regarding metadata indexing with Elasticsearch, I strongly recommend reading the article Simple Metadata Indexing through Grid for Apps, where we detail how it works.
Because a snippet of code is often worth a thousand words, here is the workflow that we will implement:
Let's do it!
As in our previous article, we will use our Docker container image to easily spawn an OpenIO SDS environment. We will also deploy Elasticsearch from its official Docker image.
Retrieve the latest Elasticsearch Docker image (5.4.0 at the time of writing):
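As a sketch, assuming the official Elastic registry (the image path used for the 5.x series):

```shell
# pull Elasticsearch 5.4.0 from the official Elastic registry
docker pull docker.elastic.co/elasticsearch/elasticsearch:5.4.0
```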
And start an Elasticsearch instance:
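Something along these lines should work; note that X-Pack security is enabled by default in the 5.x images, so we disable it here to keep the demo simple (an assumption — adjust if you need authentication):

```shell
# run Elasticsearch in the background, exposing the REST API on port 9200
docker run -d --name elasticsearch -p 9200:9200 \
  -e "xpack.security.enabled=false" \
  docker.elastic.co/elasticsearch/elasticsearch:5.4.0
```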
Retrieve the OpenIO SDS Docker image:
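Assuming the image published on Docker Hub:

```shell
# pull the latest OpenIO SDS image from Docker Hub
docker pull openio/sds
```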
Start your new OpenIO SDS environment:
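A minimal sketch; running the container interactively drops you into a shell inside the SDS environment:

```shell
# start an interactive OpenIO SDS sandbox
docker run -ti openio/sds
```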
You should now be at the prompt with an OpenIO SDS instance up and running.
Next, we will configure the trigger, so that each time you add a new object, metadata from the object will be pushed to Elasticsearch. Add the following content to the file /etc/oio/sds/OPENIO/oio-event-agent-0/oio-event-handlers.conf:
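As a rough sketch of what such a handler could look like — the filter name, the `egg:oio#webhook` entry point, and the endpoint URL are assumptions here; refer to our previous article and to your SDS version for the authoritative syntax:

```ini
[handler:storage.content.new]
pipeline = webhook

[filter:webhook]
use = egg:oio#webhook
# hypothetical endpoint: the address where detect-pattern.py listens for events
endpoint = http://127.0.0.1:9000/invoke
```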
(If you want to learn more about this configuration file, please refer to our previous article.)
Next, restart the OpenIO event agent to enable the modification:
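With gridinit, which supervises OpenIO services, this could look like the following (the service identifier is an assumption based on the default `OPENIO` namespace; list your services with `gridinit_cmd status` if it differs):

```shell
# restart the event agent so it picks up the new handler configuration
gridinit_cmd restart OPENIO-oio-event-agent-0
```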
Your event-driven system is now up and running.
Before writing the script, we first need to install some Python modules:
Finally, install TensorFlow:
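As a sketch — the exact module list depends on the script, but at minimum the official Elasticsearch Python client and TensorFlow itself are needed:

```shell
# Elasticsearch client, used by the script to index the object's metadata
pip install elasticsearch
# TensorFlow, used to classify the uploaded pictures
pip install tensorflow
```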
And we can now write the script; let’s call it detect-pattern.py.
As the script is pretty long, it was uploaded to GitHub. Download it using curl:
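Something like the following, where the URL is a hypothetical placeholder — substitute the actual raw GitHub link to the script:

```shell
# hypothetical URL: replace with the real raw GitHub link to detect-pattern.py
curl -o detect-pattern.py https://raw.githubusercontent.com/<user>/<repo>/master/detect-pattern.py
```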
You will have to modify the IP address of the Elasticsearch instance in the script. In my case, the IP address of my machine was 192.168.99.1. Change the following line:
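One quick way to do this — assuming the address appears literally in the script — is with `sed`:

```shell
# swap the hard-coded Elasticsearch address (192.168.99.1 in my case)
# for the IP of your own Elasticsearch instance
sed -i 's/192\.168\.99\.1/<your-es-ip>/' detect-pattern.py
```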
Change it according to your environment.
Finally, launch it in background:
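A minimal sketch, keeping the script alive after the shell exits and capturing its output in a log file:

```shell
# run the script in the background; events will be processed as objects arrive
nohup python detect-pattern.py > detect-pattern.log 2>&1 &
```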
Please note that the script is written in Python, but you can write it in any other language.
How does it work?
It’s time to add a new picture to see if it works. First of all, let's download this picture:
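For instance (the URL below is a hypothetical placeholder — any picture of a volcano will do for this demo):

```shell
# hypothetical URL: replace with the link to the sample picture
curl -o volcano.jpg https://example.com/volcano.jpg
```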
Using the OpenIO CLI, let’s upload it to the container mycontainer in the account myaccount.
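As a sketch with the OpenIO CLI (flags may vary slightly with your SDS version):

```shell
# upload volcano.jpg into mycontainer under the account myaccount
openio object create mycontainer volcano.jpg --oio-account myaccount
```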
Well done! You have just uploaded the picture and, in the background, its metadata was enriched with its category and indexed in Elasticsearch. Let's check the metadata belonging to this new object:
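With the OpenIO CLI, this could look like:

```shell
# display the object's details, including its enriched properties
openio object show mycontainer volcano.jpg --oio-account myaccount
```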
With the following return:
To conclude, let’s ask Elasticsearch to find all the objects that match the property configfile:
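A sketch of such a query against the Elasticsearch REST API — the field name is an assumption and depends on how the script indexes the metadata, so adjust it to your mapping:

```shell
# search the index for objects whose property matches "volcano"
curl -s 'http://<your-es-ip>:9200/_search?pretty' \
  -H 'Content-Type: application/json' \
  -d '{"query": {"match": {"configfile": "volcano"}}}'
```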
Our newly uploaded file will be returned by Elasticsearch as matching the query "volcano".