https://www.youtube.com/watch?v=SGUCcjHTmGY
This video, put together by OpenAI's founders, covers the development of Codex and several examples of how it can be used in practice.
No-code (or "codeless") programming allows anyone, with or without coding experience, to build working applications for the web and for mobile devices. With no-code, you define WHAT the program does, not HOW it does it, and the application platform handles the coding. The purpose of this blog is to provide news, articles, no-code examples, and additional information to help you “master” no-code app building - the future of software development.
The hottest trends in no-code right now are workflow automation, machine learning, and data visualization. Airtable just made a move to bolster the data analysis and visualization options it offers Airtable users by buying Bayes.com.
Bayes' system proactively recommends ways to view and analyze your data as soon as you upload a data file. Many of those visualization methods are reminiscent of what users try to do with Excel - which should fit right in with Airtable's spreadsheet format.
With Bayes you can also easily export your charts and graphs to emails, PowerPoint presentations, or other media. And you can use your data visualizations in an interactive mode, allowing others to view and comment on each example.
It's probably going to take a little while for Airtable to get the new forms of data analysis integrated into their platform, but it should add an important feature to their system.
A few days ago OpenAI (www.openai.com) announced the release of a new version of OpenAI Codex, their AI system that translates natural language statements into program code. Now you can use English to describe what you want your software project to do, and OpenAI’s AI model will automatically generate the corresponding computer code, in whatever programming language you choose.
OpenAI Codex is a "descendant" of GPT-3 (a neural network machine learning model) and is capable of producing code in Python, JavaScript, Go, Perl, PHP, Ruby, Swift and TypeScript. Codex is basically a version of GPT-3, but one that has been trained on program code instead of ordinary text material. That allows Codex to do things like complete lines of code or entire routines, but originally it wasn't truly a tool that non-coders could easily use.
That’s changed with the new API version, which interprets everyday requests like “make the ball bounce off the sides of the screen” or “download this data using the public API and sort it by date,” and creates working program code in any one of a dozen languages. Also, in addition to being "fluent" in a variety of coding languages, the fact that Codex is trained on pretty much all the public code on GitHub (and other repositories), means it’s aware of standard practices in coding.
Because of its ability to handle natural language processing, Codex can be used to interact with other software and perform functions that aren't built into the application. OpenAI also says that Codex can convert or "translate" code from one programming language to another, providing code portability.
Right now Codex is only available to a select group of beta testers. Once it's made available to a wider segment of the public I'll attempt to get a copy and work out an actual example or two. Until then, Codex is a product to keep a close eye on.
In Part 1 of this post we covered no-code machine learning platforms in general. In this post I want to go through a few examples of how these platforms actually work. One thing that needs to be done regardless of the platform is to go through the input data (if it hasn't already been prepared for analysis) and get it into the proper format.
If you're supplying your own data, the first step in building a machine learning model is to "clean" the input data that will be used to train the model. For example, let's say you have a voter file and want to make some predictions about voting patterns based on how long ago each person registered to vote. Some of the registration dates may be missing or incomplete (such as 9/22/0), some may be entered with dashes (5-14-1987) and others with slashes (5/14/1987), and some may be obviously incorrect (11-03-1847). The missing and incorrect entries need to be removed from the input data, and the incomplete records may either be removed or have the missing portion of the date filled in. Once that's done, all the records need to be formatted the same way for consistency. Data in some form other than numbers (such as images, sound clips, etc.) may require a different approach, but it will still need some type of preparation before it can be used to train a machine learning model.
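To make the cleaning step concrete, here's a small Python sketch that handles the voter-file examples above. The validity rules (four-digit years, nothing before 1900 or in the future) are my own assumptions for illustration - the right cutoffs depend on your data:

```python
import re
from datetime import datetime

def clean_registration_date(raw):
    """Normalize a registration date to ISO format (YYYY-MM-DD).

    Returns None for missing, incomplete, or clearly incorrect entries,
    which the caller can then drop from the training data.
    """
    if not raw or not raw.strip():
        return None                      # missing entry
    # Accept either dashes or slashes as separators
    parts = re.split(r"[-/]", raw.strip())
    if len(parts) != 3:
        return None
    month, day, year = parts
    if len(year) < 4:
        return None                      # incomplete year such as "9/22/0"
    try:
        date = datetime(int(year), int(month), int(day))
    except ValueError:
        return None                      # impossible date
    # Reject obviously incorrect years (no living voter registered in 1847)
    if date.year < 1900 or date > datetime.now():
        return None
    return date.strftime("%Y-%m-%d")

raw_dates = ["5-14-1987", "5/14/1987", "9/22/0", "11-03-1847", ""]
cleaned = [clean_registration_date(d) for d in raw_dates]
# Both valid entries normalize to the same format; the rest become None
print(cleaned)  # ['1987-05-14', '1987-05-14', None, None, None]
```

Notice that the dash and slash versions of the same date now come out identical, which is exactly the consistency the model training step needs.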
At this point you're ready to choose a platform and begin building your ML model. These days there are quite a few no-code platforms to choose from - here are several examples:
To see Teachable Machine in action search for: “Teachable Machine Tutorial: Bananameter” by Barron Webster (November 7, 2019). The tutorial shows how he used Teachable Machine to identify images of bananas that aren't ripe yet, bananas that are ripe, and bananas that are past their prime. And for a more in-depth look at how Teachable Machine works, take a look at: “How to build a Teachable Machine with TensorFlow.js”, Nikhil Thorat – deeplearnjs.org.
Scroll down the web page to the heading “How to use Machine Learning in Excel?” and follow the sequence of steps involved. The pre-trained model used is designed to do “sentiment analysis”. It analyzes customer comments uploaded from an Excel spreadsheet, predicts the main sentiment in each set of comments, and returns an Excel spreadsheet with a new column containing the model's predictions. There's also a section in the article covering how to create, train and use a “classification” model, as well as a link to a comprehensive guide to creating compelling data visualizations in Excel.
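The round trip described above is easy to picture: the model reads one comment per row and writes its prediction into a new column. Here's a minimal Python sketch of that workflow, using a naive keyword score as a stand-in for the platform's trained sentiment model (the word lists, labels, and sample comments are all invented for illustration):

```python
import csv
import io

POSITIVE = {"great", "love", "excellent", "fast", "helpful"}
NEGATIVE = {"slow", "broken", "terrible", "refund", "confusing"}

def predict_sentiment(comment):
    """Toy stand-in for a trained sentiment model."""
    words = set(comment.lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

def add_sentiment_column(csv_text):
    """Read rows with a 'comment' column, append a 'sentiment' column."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    for row in rows:
        row["sentiment"] = predict_sentiment(row["comment"])
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=["comment", "sentiment"])
    writer.writeheader()
    writer.writerows(rows)
    return out.getvalue()

sample = ("comment\n"
          "The support team was helpful and fast\n"
          "Checkout is broken and confusing\n")
print(add_sentiment_column(sample))
```

A real sentiment model is far more sophisticated than a keyword count, but the spreadsheet-in, spreadsheet-plus-predictions-out shape of the process is the same.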
This particular model scans images of metal pump impellers and classifies them as either defective or not defective, which makes this a binary image classification problem. The inner workings of a deep learning model can be complicated, but you don't need to understand all the details in order to build a model like this. Rather than going through the whole process, I'll just list the steps involved:
Once the model has been trained, the final step is to deploy it and see how it performs on new data that it hasn't seen before. Here's how to deploy the model:
If there's a problem with the results of the deployment test, you can review the different reports from the training run and see if you want to make changes in the project setup and rerun the experiment.
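Even though the platform hides the mechanics, the core idea of binary classification is small enough to sketch. Below is a toy logistic-regression classifier in pure Python; the two "features" and their values are invented for illustration (a real image model learns from raw pixels, not two hand-picked measurements), but the train-then-predict loop is the same in spirit:

```python
import math

def train_logistic(samples, labels, lr=0.5, epochs=2000):
    """Train a tiny logistic-regression model with gradient descent.

    samples: list of feature vectors; labels: 1 = defective, 0 = ok.
    """
    n = len(samples[0])
    w = [0.0] * n
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))        # predicted probability
            err = p - y
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def predict(w, b, x):
    """Return 1 (defective) if the predicted probability is >= 0.5."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if 1.0 / (1.0 + math.exp(-z)) >= 0.5 else 0

# Invented features: [surface roughness, weight deviation]
train_x = [[0.1, 0.0], [0.2, 0.1], [0.8, 0.9], [0.9, 0.7]]
train_y = [0, 0, 1, 1]                             # 1 = defective
w, b = train_logistic(train_x, train_y)
print(predict(w, b, [0.15, 0.05]))  # low roughness -> 0 (not defective)
print(predict(w, b, [0.85, 0.80]))  # high roughness -> 1 (defective)
```

A no-code platform wraps this same fit-and-threshold idea in many layers of learned features, which is why you can get a working impeller classifier without writing any of it yourself.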
That's a quick look at how no-code AI platforms work. Since this is one of the fastest growing parts of the no-code landscape I plan on doing several more posts on this topic in the near future. In the meantime, there are many other platforms like the ones mentioned above - if you're interested in machine learning, by all means pick one and try building a few models yourself.
The fastest-growing area in programming these days is data analysis. It has become a necessity for businesses of all sizes as a way to uncover mission-critical information such as trends in the marketplace, ways to reduce customer churn, and product line strengths and weaknesses. Until recently only large organizations could afford the services of data analysts, specialists who could construct the artificial intelligence models needed to uncover that information. Now, an increasing number of "no-code" platforms allow anyone to quickly build an accurate machine learning model - without the need for coding or training in data science.
As machine learning models become easier to create, they're also coming into play in areas other than the business world. ML models have been used to make predictions ranging from where and when wildfires are most likely to occur to identifying optimal Covid-19 testing and vaccination strategies. In fact, as time goes on and machine learning (as well as deep learning) models become more sophisticated, no-code AI platforms are bound to be used for data analysis in more and more areas.
Popular no-code machine learning platforms include Teachable Machine (by Google), AutoML (part of Google Cloud), Peltarion, BigML, Obviously AI, CreateML (for Mac computers), Google ML Kit (to generate models for Android and iOS devices), and MonkeyLearn. Various studies have indicated that platforms like these have the potential to reduce development time for machine learning models by up to 90 percent. They also allow the end users (the people who are the actual decision makers) to construct the exact model they need.
Along with empowering "citizen developers", no-code AI platforms also provide a couple of other major benefits. They allow experienced data scientists to focus their time and effort on more complex projects which may also lead to more effective no-code solutions. And they make it possible to try out AI solutions much faster and with much less expense involved than with traditional methods. Building models quickly and cheaply can greatly increase the chance of finding a really useful solution to a particular problem, since it's not always clear what type of algorithm would work best.
In some cases a no-code platform will require you to select an algorithm or type of model that fits your data, set certain training parameters, and then "train" the model on your dataset. Other platforms may simply have you pick one of their pre-trained models and use it to analyze your dataset. In either case the process is done without any coding. Training a model on your own data takes longer, but it can produce a more accurate model. Using a pre-trained model usually works best when you're dealing with a common machine learning problem - like predicting customer churn - where standard models are available.
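The difference between those two paths is easy to see in miniature. In the Python sketch below, `train_churn_model` learns a usage threshold from your own labeled customers, while `PRETRAINED_THRESHOLD` plays the role of a platform-supplied model; the customer data and the single-feature "model" are invented purely for illustration:

```python
# Each customer: (monthly_logins, churned?) -- invented data
training_data = [(25, False), (18, False), (12, False),
                 (4, True), (2, True), (6, True)]

PRETRAINED_THRESHOLD = 5   # stand-in for a platform's pre-trained churn model

def train_churn_model(data):
    """Learn the threshold that separates churners from non-churners:
    the midpoint between the highest-usage churner and the
    lowest-usage customer who stayed."""
    churn_logins = [logins for logins, churned in data if churned]
    stay_logins = [logins for logins, churned in data if not churned]
    return (max(churn_logins) + min(stay_logins)) / 2

def predict_churn(logins, threshold):
    """Predict churn when usage falls below the threshold."""
    return logins < threshold

learned = train_churn_model(training_data)    # (6 + 12) / 2 = 9.0
print(predict_churn(8, learned))              # True: below the learned threshold
print(predict_churn(8, PRETRAINED_THRESHOLD)) # False: generic model disagrees
```

The customer with 8 logins is flagged by the model trained on this business's own data but missed by the generic threshold - which is exactly why training on your own data, though slower, can be more accurate than a one-size-fits-all pre-trained model.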
The general steps in training a machine learning model include:
Years ago when "shareware" was a big thing, I wrote a do-it-yourself desktop database program. The program sold a lot of copies and a major PC magazine even featured it in an article about database management software. The problem was that I included every feature I could dream up before I released it, to try and make it the most versatile database software on the shareware market. As more people started using it I began to get all kinds of suggestions about things I should have done differently or functions that would be great to add to the program.
The first few changes I made were fairly small, but some of the suggestions people were making on my forum required major re-working of the program code. I ignored some of the "...wouldn't it be great if..." ideas, but there were other changes and additions that really needed to be included. I managed to make the modifications, but it became harder with each new feature I added since the program began to resemble a tire that had been patched too many times. The code got more and more interwoven and testing each new version became almost a full-time process.
With no-code you don't have to worry about "spaghetti code", but you can still end up with a program that doesn't flow well and requires major re-working each time you make a change if you start off with what you believe is a "finished" product. Inevitably, there are going to be some changes and additions needed once other people start using the app. It's much easier to create a clean, well-organized, easy-to-use application if you get feedback from actual users before you try building out the final product.
You're starting a new project and you've decided to use a no-code or low-code platform to build your app - but which platform should you choose? There are more and more no-code and low-code app development platforms showing up every week and picking one to use can be really confusing. So how do you go about making that choice? Here are a few suggestions that may help you answer that question: