This is part of Solutions Review’s Premium Content Series, a collection of reviews written by industry experts in maturing software categories. In this article, Rosaria Silipo walks through the development of an AI-based application with low-code tools so you can stay informed.

It has become increasingly easy to create artificial intelligence (AI) solutions. You don’t have to be a full-fledged data scientist to train a machine learning (ML) model. You don’t have to be a full-fledged programmer to include AI in your business applications. The right low-code tools can take you far in the development and production of AI-based applications.

The AI-based app

Recently, we developed a relatively complex AI application that includes:

  • Collection of data in the form of user comments on a simple online game.
  • Sentiment analysis of user feedback with various ML and natural language processing (NLP) models.
  • Production and orchestration of all modules together.

AI models were trained and individual modules developed using a low-code platform for data science. The final application was then made available to the real world via a dedicated deployment and production platform.

The online game and user feedback collection

The online game was a “Guess the Flag” game. The user is presented with a randomly selected flag and has three chances to guess which country it belongs to. Each game session includes 10 flags. At the end of each session, the user’s pass and fail statistics are presented in a dashboard, and the user has the option to leave a review.

Picture 1

The whole game was implemented as a web application without writing a single line of code. The low-code tool for data science, KNIME Analytics Platform, relies on the concept of “components” to implement interactive web pages. Although components were originally intended for dashboard-like pages, they are flexible enough that we could implement the entire web interface of the game with:

  • Presentation of the flag
  • Response options
  • The final statistics dashboard
  • The feedback form

Picture 2
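
Although the real interface was assembled entirely from KNIME components, the session logic described above is simple enough to sketch in a few lines of code. The snippet below is purely illustrative: the flag list, the function name, and the console input are assumptions for the sketch, and no such code exists in the actual application.

```python
# Illustrative sketch of the game session logic: 10 flags per session,
# up to three guesses per flag. All names and data here are hypothetical;
# the real game was built from KNIME components, not Python code.
import random

FLAGS = {"italy.png": "Italy", "japan.png": "Japan",
         "brazil.png": "Brazil", "canada.png": "Canada"}  # only four sample flags here

def play_session(num_flags: int = 10, max_guesses: int = 3) -> dict:
    passed = failed = 0
    sample = random.sample(list(FLAGS.items()), min(num_flags, len(FLAGS)))
    for flag_image, country in sample:
        for _ in range(max_guesses):
            guess = input(f"Which country does {flag_image} belong to? ")
            if guess.strip().lower() == country.lower():
                passed += 1
                break
        else:                      # three wrong guesses
            failed += 1
    # These totals feed the final statistics dashboard.
    return {"passed": passed, "failed": failed}

print(play_session())
```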

The sentiment analysis app

A sentiment analysis application was then applied to all the reviews collected in the last 24 hours to evaluate the user experience of the game. There are many ways to implement a sentiment analysis application, depending on the available time, costs, expectations, and data.

  • A solution based on classic NLP rules and statistics is easy to implement but requires specific linguistic knowledge.
  • In recent years, solutions based on machine learning algorithms have been favored. In this case, however, a representative data set must be available to train the algorithm.
  • Still within machine learning, deep learning models have shown even better performance, although they require more example data for training.
  • Recently, pre-trained transformer models have been made available by companies with access to large amounts of textual data.
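
As an aside on the last option, applying a pre-trained transformer can take only a few lines of code. The sketch below uses the Hugging Face transformers library as an example; the model, the sample reviews, and the surrounding code are assumptions for illustration and are not part of the project.

```python
# Minimal sketch of the pre-trained transformer option, assuming the Hugging
# Face "transformers" library is installed. The sample reviews are made up.
from transformers import pipeline

sentiment = pipeline("sentiment-analysis")  # downloads a default model on first use

reviews = [
    "Great game, I learned a lot of flags!",
    "Too hard, I kept failing on the small island states.",
]

for review in reviews:
    result = sentiment(review)[0]  # e.g. {'label': 'POSITIVE', 'score': 0.99}
    print(f"{result['label']:8s} {result['score']:.2f}  {review}")
```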

Below is the KNIME workflow that implements a classic NLP-based solution for sentiment analysis. We chose the classic NLP approach for its simplicity; any of the other approaches listed above could have been implemented instead.

Picture 3
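
For readers who prefer code to a workflow diagram, the core idea behind a rule-based scorer can be reduced to counting words from positive and negative lexicons. The tiny lexicons and the scoring rule below are placeholders, not the dictionaries or rules used in the actual workflow.

```python
# Illustrative lexicon-based sentiment scorer in plain Python. The word lists
# and the scoring rule stand in for the resources used in the real workflow.
POSITIVE = {"fun", "great", "love", "easy", "nice"}
NEGATIVE = {"boring", "hard", "hate", "confusing", "slow"}

def score_review(text: str) -> str:
    """Count positive and negative words and return a sentiment label."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(score_review("Great game, really fun!"))    # positive
print(score_review("Boring and way too hard."))   # negative
```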

Production and Orchestration

Before moving the two applications, the flag game and the sentiment analysis, to production, we modularized them as much as possible. One service implemented the presentation of the flags and the evaluation of the answers, one service collected the feedback, one service performed the sentiment analysis, and one service generated the reports. Each service consisted of one logical block and was deployed as a standalone REST service.

The orchestration of all the services was achieved through the functionality of KNIME Server. KNIME Server can schedule the execution of a module, call modules in cascade after other modules have finished, and consume the services exposed by another module via the Call Workflow Service node.
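
To give a sense of what a standalone REST service means in practice, the snippet below shows how a client might invoke one of the deployed services. The server URL, workflow path, credentials, and payload shape are assumptions for this sketch, not the project’s actual endpoints.

```python
# Illustrative client call to a deployed REST service (here, the sentiment
# analysis service). URL, path, credentials, and payload are placeholders.
import requests

SERVER = "https://knime.example.com/knime/rest/v4/repository"
WORKFLOW = "/production/sentiment_analysis:execution"

payload = {"reviews": ["Great game!", "Too hard, not fun."]}

response = requests.post(
    SERVER + WORKFLOW,
    json=payload,
    auth=("service_user", "service_password"),  # placeholder credentials
    timeout=60,
)
response.raise_for_status()
print(response.json())  # e.g. the sentiment labels returned by the workflow
```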

The stages of success

Sounds easy, right? And it was. However, there is more to this project than meets the eye. Many steps were necessary for a successful implementation:

  • Management Acceptance
  • Project definition
  • Search for skills
  • Web development
  • Training and deploying AI models
  • Service production
  • Orchestration

Let’s start at the bottom of the list. The open-source low-code tool we used, KNIME Analytics Platform, took care of web development, AI model training, and model deployment. Its commercial companion, KNIME Server, handled production and orchestration. Both turned out to be the easiest parts of the whole project, mainly thanks to the low-code approach.

So, the key to the success of this project was to reach the development phase with very clear ideas about what we wanted to develop, the resources we could devote to it and the support we needed from management. Nothing really compares to clear ideas. Indeed, this was the most complex part of the project: understanding what needed to be developed and how all the pieces could work together. It took a few meetings and more than a few conversations to come up with a clear and definitive vision.

The objective of the project has always been to evaluate the comments left by users. We first considered book reviews, but users would have had to read at least one or two books before leaving comments, and the feedback collection process would have been too slow. We thought about songs, but copyright issues came up. We considered restaurants, but there we would not have owned the data. Then one of us had the idea to create a game. Gamification is a powerful tool to get users to interact. To avoid copyright issues, we created our own game with flags, to which we added a feedback collection form. Web development here required only basic UI skills to make the game as easy to use as possible.

The second part was the evaluation of the feedback texts. For this, even a novice data scientist would have been able to implement a sufficiently accurate solution based on one of the four sentiment analysis approaches listed above.

Finally, the orchestration and release process had to be handled by a data engineer, someone with an eye for spotting isolated logical tasks and turning them into services.

Once all of that was clarified, we discussed the idea with management and got their support pretty quickly; we recruited the required professional profiles, namely a data engineer, a data scientist, and a student with UI expertise; and we chose an open-source low-code tool to contain costs and avoid hiring JavaScript programmers, Python experts, and IT professionals. After that, and probably thanks to that, the development and production phases didn’t seem like such big hurdles anymore.

Rosaria Silipo