Thursday, August 12, 2021

Airtable Acquires Bayes.com

The hottest trends in no-code right now are workflow automation, machine learning, and data visualization. Airtable just made a move to bolster the data analysis and visualization options it offers users by buying Bayes.com.

Bayes' system proactively recommends ways to look at and analyze your data as soon as you upload a file. Many of those visualization methods are reminiscent of what users try to do with Excel - which should fit right in with Airtable's spreadsheet format.

With Bayes you can also easily export your charts and graphs to emails, PowerPoint presentations, or other media. And you can use your visualizations in an interactive mode, allowing others to view and comment on each one.

It's probably going to take a little while for Airtable to get the new forms of data analysis integrated into their platform, but it should add an important feature to their system.

Wednesday, August 11, 2021

OpenAI Codex - A New Brand of No-Code

A few days ago OpenAI (www.openai.com) announced the release of a new version of OpenAI Codex, their AI system that translates natural language statements into program code. Now you can use English to describe what you want your software project to do, and OpenAI's AI model will automatically generate the corresponding computer code, in whatever programming language you choose.

OpenAI Codex is a "descendant" of GPT-3 (a neural network machine learning model) and is capable of producing code in Python, JavaScript, Go, Perl, PHP, Ruby, Swift and TypeScript. Codex is basically a version of GPT-3, but one that has been trained on program code instead of ordinary text material. That allows Codex to do things like complete lines of code or entire routines, but originally it wasn't truly a tool that non-coders could easily use.

That's changed with the new API version, which interprets everyday requests like "make the ball bounce off the sides of the screen" or "download this data using the public API and sort it by date," and creates working program code in any one of a dozen languages. And because Codex is trained on pretty much all the public code on GitHub (and other repositories), it's aware of standard coding practices in addition to being "fluent" in a variety of languages.
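
To give a sense of what that looks like in practice, here's a sketch of the kind of prompt-and-completion pair involved. The comment is the natural language request; the Python below it is one plausible completion (the API URL is a made-up placeholder, not a real endpoint):

    # Prompt: download this data using the public API and sort it by date
    import requests

    # Hypothetical public API endpoint - not a real service.
    response = requests.get("https://api.example.com/records")
    records = response.json()

    # Sort the downloaded records by their "date" field.
    records.sort(key=lambda record: record["date"])

    for record in records:
        print(record["date"], record)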

Because of its natural language processing abilities, Codex can be used to interact with other software and perform functions that aren't built into the application. OpenAI also says that Codex can convert or "translate" code from one programming language to another, providing code portability.

Right now Codex is only available to a select group of beta testers. Once it's made available to a wider segment of the public I'll attempt to get a copy and work out an actual example or two. Until then, Codex is a product to keep a close eye on.

Friday, August 6, 2021

No-Code Data Science (Part 2)

In Part 1 of this post we covered no-code machine learning platforms in general. In this post I want to go through a few examples of how these platforms actually work. One thing that needs to be done regardless of the platform is to go through the input data (if it hasn't already been prepared for analysis) and get it into the proper format.

If you're supplying your own data, the first step in building a machine learning model is to "clean" the input data that will be used to train the model. For example, let's say you have a voter file and want to make some predictions about voting patterns based on how long ago each person registered to vote. Some of the registration dates may be missing or incomplete (such as 9/22/0), some may be entered with dashes (5-14-1987) and others with slashes (5/14/1987), and some may be obviously incorrect (11-03-1847). The missing and incorrect entries need to be removed from the input data, and the incomplete records can either be removed or repaired by filling in the missing portion of the date. Once that's done, all the records need to be formatted the same way for consistency. Data that's in some form other than numbers (such as images, sound clips, etc.) may require a different approach, but it will still need some type of preparation before you can use it to train a machine learning model.
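
To make that concrete, here's a minimal sketch of that kind of date cleanup using Python's pandas library (the file and column names are assumptions for the example):

    import pandas as pd

    # Load the voter file - "registration_date" is an assumed column name.
    voters = pd.read_csv("voter_file.csv")

    # Parse dates written with dashes (5-14-1987) or slashes (5/14/1987);
    # missing or unparseable entries (like 9/22/0) become NaT.
    voters["registration_date"] = pd.to_datetime(
        voters["registration_date"], errors="coerce"
    )

    # Drop rows with missing/invalid dates, plus obviously wrong years like 1847.
    voters = voters.dropna(subset=["registration_date"])
    voters = voters[voters["registration_date"].dt.year >= 1900]

    # Re-format every remaining date consistently as YYYY-MM-DD.
    voters["registration_date"] = voters["registration_date"].dt.strftime("%Y-%m-%d")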

At this point you're ready to choose a platform and begin building your ML model. These days there are quite a few no-code platforms to choose from - here are several examples:

  • Teachable Machine (https://teachablemachine.withgoogle.com/) - Google's Teachable Machine is a good place to start to get a feeling for how no-code AI platforms work. You can train models yourself to recognize certain images, sounds or poses (like whether you're sitting down or standing up), and host those models for free directly on the Teachable Machine website or in your own website or app. The models are built using TensorFlow.js, a JavaScript library for machine learning, and are trained using a method called "transfer learning". In transfer learning a fully trained neural network model is used, but the original training data is replaced with your input.

To see Teachable Machine in action, search for "Teachable Machine Tutorial: Bananameter" by Barron Webster (November 7, 2019). The tutorial shows how he used Teachable Machine to identify images of bananas that aren't ripe yet, bananas that are ripe, and bananas that are past their prime. And for a more in-depth look at how Teachable Machine works, take a look at "How to build a Teachable Machine with TensorFlow.js" by Nikhil Thorat at deeplearnjs.org.
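
Teachable Machine can also export a trained model for use in your own code - for image models, one of the export options is a Keras model file. Here's a rough sketch of loading an exported model in Python and classifying a new image (the file names and label order are assumptions, and the 224x224, [-1, 1] input format follows what Teachable Machine's exported sample code typically uses):

    import numpy as np
    from PIL import Image
    from tensorflow import keras

    # Load the Keras model exported from Teachable Machine (assumed file name).
    model = keras.models.load_model("keras_model.h5")

    # Exported image models typically expect 224x224 RGB input scaled to [-1, 1].
    image = Image.open("banana.jpg").convert("RGB").resize((224, 224))
    pixels = (np.asarray(image, dtype=np.float32) / 127.5) - 1.0
    batch = np.expand_dims(pixels, axis=0)  # shape: (1, 224, 224, 3)

    # Class names in the order they were defined during training (assumed labels).
    labels = ["unripe", "ripe", "overripe"]
    probabilities = model.predict(batch)[0]
    print(labels[int(np.argmax(probabilities))])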

  •  MonkeyLearn - MonkeyLearn is a no-code machine learning platform that specializes in text classification and extraction models which you can use to analyze your raw data in Microsoft Excel. To learn exactly how this works, there's a step-by-step explanation of the process involved at: https://monkeylearn.com/blog/machine-learning-in-excel/.
Scroll down the web page to the heading “How to use Machine Learning in Excel?” and follow the sequence of steps involved. The pre-trained model used is designed to do “sentiment analysis”. It analyzes customer comments uploaded from an Excel spreadsheet, predicts the main sentiment in each set of comments, and returns an Excel spreadsheet with a new column containing the model's predictions. There's also a section in the article covering how to create, train and use a “classification” model, as well as a link to a comprehensive guide to creating compelling data visualizations in Excel.
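
If you'd rather call MonkeyLearn from code instead of Excel, the same models are exposed through a REST API. Here's a sketch using Python's requests library - the API key and model ID are placeholders you'd copy from your MonkeyLearn dashboard, and the endpoint follows MonkeyLearn's v3 API as documented at the time of writing:

    import requests

    API_KEY = "your-monkeylearn-api-key"   # placeholder
    MODEL_ID = "your-model-id"             # placeholder - shown in the dashboard

    # Customer comments you might otherwise upload in an Excel spreadsheet.
    comments = [
        "The product arrived on time and works great.",
        "Support never answered my emails.",
    ]

    response = requests.post(
        f"https://api.monkeylearn.com/v3/classifiers/{MODEL_ID}/classify/",
        headers={"Authorization": f"Token {API_KEY}"},
        json={"data": comments},
    )

    # Each result includes the predicted tag (e.g. Positive/Negative) and confidence.
    for item in response.json():
        print(item["text"], "->", item["classifications"][0]["tag_name"])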

  • Peltarion - Peltarion has a detailed tutorial to help explain how "deep learning" models are built (https://peltarion.com/blog/applied-ai/defect-detection). The model is designed to detect surface defects in manufactured parts. Properly trained deep learning models are faster, more accurate and more consistent than manual inspections or software using traditional programming. Note: Deep learning is a subset of machine learning, and deep learning models are normally trained with artificial neural network algorithms (algorithms that attempt to simulate the reasoning process of the human brain).

This particular model scans images of metal pump impellers and classifies them as either defective or not defective, which makes this a binary image classification problem. The inner workings of a deep learning model can be complicated, but you don't need to understand all the details in order to build a model like this. Rather than going through the whole process, I'll just list the steps involved:
    • Create a new project and give it a name.
    • Click the Data library button and search for the “Defects in metal casting” dataset.
    • If you agree with the dataset license, click “Accept” and import the data.
    • Next, you need to pre-process the data, which is split into 80% training data, 10% evaluation data (used to test the model on unlabeled data at the end of each "epoch", or training cycle, to see how well the training is progressing), and 10% test data saved for any further testing that may be needed.
    • Click the wrench icon to display the "Feature Settings" and make sure "Encoding" is set to "binary" (since the model only has two outcomes: "true", the part is defective, and "false", the part isn't defective). Also make sure "Positive class" is set to "true".
    • Click “Save version”, then click “Use in new experiment”.
    • Note: The pre-processing is complete – the next step is to build the model.
    • Click "Input(s)/Target" in the Experiment Wizard and make sure the Input feature is set to "Image" and the target is set to "Defective".
    • Click the "Snippet" tab and select "EfficientNet B0".
    • Click the "Weights" tab. The EfficientNet snippet can be trained with pre-trained weights (which allows you to make use of knowledge gained from another dataset). Using pre-trained weights saves time and improves efficiency.
    • Click "Create" to move to the "Modeling View".
    • Click on the “Input” block, select “Image augmentation” and click “Natural images”. Image augmentation adds more variation to the images – it isn't necessary, but it may help to increase the model's accuracy.
    • Click “Run” to start the experiment running.
    • Once the experiment has run to completion, click "Evaluation" at the top of the screen to view the results.
    • There are a number of performance metrics in the Evaluation view, but the primary ones for this experiment are "binary accuracy" (the percentage of correct predictions) and "recall" (the percentage of actual defective parts that were classified as defective by the model).
    • To inspect the model's predictions, select the subset and epoch (training cycle) you want to look at and click “Inspect”.
    • There are four possibilities for each prediction the model makes – a non-defective impeller is classified correctly as not defective (“false”), a non-defective impeller is classified incorrectly as “defective” (“true”), a defective impeller is classified correctly as defective (“true”), or a defective impeller is classified incorrectly as not defective (“false”).
    • Looking at the "Confusion Matrix" will show the percentages of each outcome (true positive, true negative, false positive, and false negative). Those figures will tell you a lot about how well the model did, but the most important statistic is the number of false negatives (each of which means sending out a defective part). Another important metric is the ROC (Receiver Operating Characteristic) curve, which plots true positive rates versus false positive rates. A short sketch showing how two of these metrics are calculated from the confusion-matrix counts follows this list.
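
Here's that sketch - the counts are made up for illustration, but they show how binary accuracy and recall fall out of the four confusion-matrix cells:

    # Made-up counts for illustration - one per confusion-matrix cell.
    true_positives = 45    # defective parts correctly flagged as defective
    true_negatives = 940   # good parts correctly passed
    false_positives = 10   # good parts incorrectly flagged as defective
    false_negatives = 5    # defective parts that slipped through (the costly case)

    total = true_positives + true_negatives + false_positives + false_negatives

    # Binary accuracy: the percentage of all predictions that were correct.
    accuracy = (true_positives + true_negatives) / total

    # Recall: the percentage of actually defective parts the model caught.
    recall = true_positives / (true_positives + false_negatives)

    print(f"accuracy = {accuracy:.1%}, recall = {recall:.1%}")
    # prints: accuracy = 98.5%, recall = 90.0%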

Once the model has been trained, the final step is to deploy it and see how it performs on new data that it hasn't seen before. Here's how to deploy the model:

    • Download and unzip the test data (the portion of the data that wasn't used to train the model).
    • Go to the "Deployment view" and click "New deployment".
    • When the “Create deployment” screen displays, select the experiment you want to deploy and choose the epoch (training cycle) that had the best results to be your checkpoint.
    • Click the “Enable” button to deploy the experiment.
    • Click the “Test deployment” button to try the model out on the test data you downloaded.

If there's a problem with the results of the deployment test, you can review the different reports from the training run and see if you want to make changes in the project setup and rerun the experiment.
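
You can also exercise a deployment programmatically. The exact request format is shown on Peltarion's Deployment view, but the general pattern is to POST a test image to the deployment's REST endpoint. Here's a rough sketch in Python, where the URL, token, and input feature name are all placeholders you'd take from your own deployment:

    import requests

    # Placeholders - the real values appear in the Deployment view.
    DEPLOYMENT_URL = "https://your-deployment-url.example.com"
    TOKEN = "your-deployment-token"

    # Post one test image; "Image" is assumed to match the model's input feature name.
    with open("impeller_001.jpeg", "rb") as image_file:
        response = requests.post(
            DEPLOYMENT_URL,
            auth=(TOKEN, ""),
            files={"Image": image_file},
        )

    # The response contains the model's prediction for the "Defective" target.
    print(response.json())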

That's a quick look at how no-code AI platforms work. Since this is one of the fastest growing parts of the no-code landscape I plan on doing several more posts on this topic in the near future. In the meantime, there are many other platforms like the ones mentioned above - if you're interested in machine learning, by all means pick one and try building a few models yourself. 


Tuesday, August 3, 2021

No-Code Data Science


One of the fastest-growing areas in programming these days is data analysis. It has become a necessity for businesses of all sizes in order to determine mission-critical information such as trends in the marketplace, ways to reduce customer churn, and product line strengths and weaknesses. Until recently only large organizations could afford the services of data analysts, specialists who could construct the artificial intelligence models needed to uncover that information. Now, there are an increasing number of "no-code" platforms that allow anyone to quickly build an accurate machine learning model - without the need for coding or training in data science.

As machine learning models become easier to create, they're also coming into play in areas other than the business world. ML models have been used to make predictions ranging from where and when wildfires are most likely to occur to identifying optimal Covid-19 testing and vaccination strategies. In fact, as time goes on and machine learning (as well as deep learning) models become more sophisticated, no-code AI platforms are bound to be used for data analysis in more and more areas.

Popular no-code machine learning platforms include Teachable Machine (by Google), AutoML (part of Google Cloud), Peltarion, BigML, Obviously AI, CreateML (for Mac computers), Google ML Kit (to generate models for Android and iOS devices), and MonkeyLearn. Various studies have indicated that platforms like these have the potential to reduce development time for machine learning models by up to 90 percent. They also allow the end users (the people who are the actual decision makers) to construct the exact model they need.

Along with empowering "citizen developers", no-code AI platforms provide a couple of other major benefits. They allow experienced data scientists to focus their time and effort on more complex projects (which may in turn lead to more effective no-code solutions). And they make it possible to try out AI solutions much faster and with much less expense than traditional methods. Building models quickly and cheaply can greatly increase the chance of finding a really useful solution to a particular problem, since it's not always clear what type of algorithm will work best.

In some cases a no-code platform will require you to select an algorithm or type of model to fit your data, select certain training parameters, and then "train" the model on your dataset. Other platforms may simply have you pick one of their pre-trained models and use it to analyze your dataset. In either case the process is done without requiring any coding. Training a model on your own data takes longer, but it can produce a more accurate model. Using a pre-trained model usually works best when you're dealing with a common machine learning problem - like predicting customer churn - where there are standard models available.

The general steps in training a machine learning model include (a short code sketch of these steps follows the list):

  • Choosing a dataset from the platform's data store or uploading your own file.
  • "Cleaning" the input data - removing any empty or outlier items and possibly "normalizing" the data to get everything on roughly the same scale.
  • Choosing a type of algorithm (such as a classification or regression algorithm) to use to train a model on your data. Each algorithm has certain settings that can be adjusted to produce the best fit with your data. The platform may analyze the data and automatically choose what it considers the optimum settings or you may be able to adjust some of the settings yourself.
  • Using the majority of the input data to train the model and a small portion for validation to see if the model produces accurate results.
  • Deploying the finished model to a website or as part of an app.
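
Here's the sketch promised above - the same sequence of steps written out with scikit-learn, just to show what a no-code platform is doing for you behind the scenes (the CSV file and column names are assumptions):

    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import StandardScaler

    # 1. Load a dataset (assumed file and column names).
    data = pd.read_csv("customers.csv")

    # 2. Clean: drop rows with missing values.
    data = data.dropna()
    features = data[["age", "tenure_months", "monthly_spend"]]
    target = data["churned"]

    # 3. Normalize the features so everything is on roughly the same scale.
    features = StandardScaler().fit_transform(features)

    # 4. Train on most of the data, holding out a portion for validation.
    X_train, X_val, y_train, y_val = train_test_split(
        features, target, test_size=0.2, random_state=42
    )
    model = RandomForestClassifier().fit(X_train, y_train)

    # 5. Check accuracy on the held-out data before deploying.
    print(f"validation accuracy: {model.score(X_val, y_val):.1%}")
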
That covers the general characteristics of a no-code machine learning platform. In Part 2 of this post I'll go over a few specific examples of creating a machine learning model and what results to expect from that model. 

Thursday, July 29, 2021

Start With a Minimum Viable Product (MVP) Version of Your App


You've probably heard or read this advice before, but it pays to create a minimal or "skeleton" version of your app and use that for initial testing. There's a strong temptation when you start building your app to include every feature you can think of before you release it to beta testers. Why create just a bare bones version? There are several reasons, but the primary reason is so you don't have to rebuild half of your app when you start getting feedback about changes and improvements.

Years ago when "shareware" was a big thing, I wrote a do-it-yourself desktop database program. The program sold a lot of copies and a major PC magazine even featured it in an article about database management software. The problem was that I included every feature I could dream up before I released it, to try and make it the most versatile database software on the shareware market. As more people started using it I began to get all kinds of suggestions about things I should have done differently or functions that would be great to add to the program.

The first few changes I made were fairly small, but some of the suggestions people were making on my forum required major re-working of the program code. I ignored some of the "...wouldn't it be great if..." ideas, but there were other changes and additions that really needed to be included. I managed to make the modifications, but it became harder with each new feature I added since the program began to resemble a tire that had been patched too many times. The code got more and more interwoven and testing each new version became almost a full-time process.

With no-code you don't have to worry about "spaghetti code", but if you start off with what you believe is a "finished" product, you can still end up with a program that doesn't flow well and requires major re-working each time you make a change. Inevitably, there are going to be some changes and additions needed once other people start using the app. It's much easier to create a clean, well-organized, easy-to-use application if you get feedback from actual users before you try building out the final product.

Monday, July 26, 2021

Which Platform Should You Use?

You're starting a new project and you've decided to use a no-code or low-code platform to build your app - but which platform should you choose? There are more and more no-code and low-code app development platforms showing up every week and picking one to use can be really confusing. So how do you go about making that choice? Here are a few suggestions that may help you answer that question:

  • Are you building a business-related app or a personal/social media type of application? Native mobile apps (programs designed specifically to run on Android or iOS devices) are generally the best bet if you're creating something that people will be using on their phone or tablet, especially if the app relies on features built into the device like the camera, GPS, biometric sensor, etc. However, web apps (optimized for mobile devices) are usually the best choice for any application that requires much data entry or analyzing detailed reports.
  • My choices for platforms that are focused on building web apps include:
    • Bubble.io - Bubble has a slightly higher learning curve than a lot of other platforms, but it allows you to build almost any type of application and it offers dozens of customizable templates. Bubble also has a wide range of tutorials contributed by users that show you how to build apps like Quora, Udemy, Medium, Quickbooks, and Instagram, among others. Hosting for your app is provided on Amazon Web Services (the leading cloud hosting provider).
    • Caspio.com - Caspio works a lot like Bubble, with the platform divided into sections for building your data tables, designing the user interface (for forms and reports), and creating workflows based on event "triggers". As for infrastructure, Caspio is hosted on AWS and uses Microsoft SQL Server as its internal database system. Also, Caspio refers to itself as a "low-code" platform, although you can definitely build apps without adding any code. In fact, both Bubble and Caspio do allow you to add some JavaScript code to perform certain actions, so in that sense they're both "low-code" platforms.
    • Zoho Creator - Zoho Creator is sort of a jack-of-all-trades. You can create desktop apps, native Android or iOS apps, or responsive mobile apps. It does have its own proprietary scripting language (Deluge) which you can use to provide customized features - that can be viewed as a plus or minus depending on your willingness to learn Deluge (although you can pick it up in a hurry). Zoho Creator also provides for a wide range of third party integrations, including all of the other Zoho products (Zoho Sheets, Zoho Books, Zoho CRM, etc.).
  • My choices for platforms you would use to build native mobile apps include:
    • Glide - If you need a fairly simple app and need it in a hurry, Glide is a great option. There are a number of basic templates available that you can use as a starting point and end up with a nice-looking, easy-to-use app. Performance used to be a bit of a problem at times since Glide's only data source was Google Sheets - however, Glide now has its own database system in place, allowing for faster response times in storing and retrieving data.
    • Appy Pie - Appy Pie is similar to Glide in that you can whip together a native mobile app in a hurry (and get it published to the App Store or to Google Play Store). However, you're pretty much restricted to picking a template, adding some pre-designed pages, and picking a particular theme. Anything more complicated usually requires integration with a third party application.
    • GoodBarber - Building an app with GoodBarber often starts with selecting one of their 50 or more templates, the same as with Glide. However, GoodBarber's templates are generally more polished, plus they provide more access to the native features on mobile devices - functions like beacons and geofences. In addition, you can build PWAs (Progressive Web Apps) with GoodBarber to provide users with an optimized mobile web application (which can also make use of certain native features of the mobile device, such as the camera or GPS).
  • AppGyver - If you're determined to find a platform where you can take your app's source code with you if you leave, AppGyver (which is now part of software giant SAP) is probably your only option. You can actually take an Android app created on AppGyver with you as an APK file that can be imported directly into Android Studio. There are also at least two other unique features about AppGyver: 1) The platform is free for anyone or any organization with annual revenue under 10 million dollars and 2) You can even make an app for TV. 
Please leave a comment or email me if you have a favorite platform that I failed to include here.

Thursday, July 22, 2021

Passing Parameters in Bubble

One of the most common actions in web design is passing information from one web page to another. If you're on a recipe website and ask for meals that use paprika, you'll probably see a new page load whose URL ends in something like "/?search=paprika". You pass parameters from one page in your Bubble app to another the same way. 

As an example, let's say you have an app that allows the user to enter the purchase price and projected lifetime of an asset (such as a lathe or drill press) and then try out different methods of depreciating that asset. Once the asset's purchase price and expected lifetime are entered, users can click a button to start a workflow in Bubble with the workflow's first (and only) action being to navigate to the page where they can select different depreciation scenarios.

In the action editor for that navigation step, the destination is the page where the users select different depreciation methods ("depreciation"), the checkbox "Send more parameters to the page" is checked, and two parameters have been entered. Each parameter consists of a "key" and a value. In this case, the first key is named "q" (it could be any text string) and its value is the asset's purchase price. The second parameter's key is named "qq" and its value is the asset's expected lifetime. Any number of parameters can be passed to another page this way.

If the user entered a purchase price of $875 and an expected lifetime of 8 years, the tail end of the URL of the "depreciation" page would look like this:

/depreciation?debug_mode=true&q=875&qq=8

(notice our two parameters added onto the end of the URL - the "debug_mode=true" parameter is added automatically by Bubble when you preview your app in debug mode)
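
Under the hood this is just standard URL query-string encoding, the same mechanism every web framework uses. A quick Python sketch shows how those parameters decode:

    from urllib.parse import urlparse, parse_qs

    url = "/depreciation?debug_mode=true&q=875&qq=8"

    # parse_qs maps each key to a list of its values.
    params = parse_qs(urlparse(url).query)
    print(params["q"][0])   # "875" - the purchase price
    print(params["qq"][0])  # "8"   - the expected lifetime in years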

There's nothing complicated about passing parameters like this in Bubble, but I read one explanation online that was a bit confusing, so I thought I would show a simple example of how it works. Hope this helps if anyone isn't clear about how to send data between pages.