Tuesday, August 31, 2021

No-Code Training Boot Camps

There are plenty of no-code training videos out there, but what if you learn best in a guided study environment? Many people focus better in a class setting, so where can you find a no-code programming camp? 

Here are a couple of options:

  • The Canvas No-Code Programming Bootcamp - This free bootcamp uses Bubble's no-code platform to teach students how to create even complex web apps. It's taught by Airdev, a major development firm, and allows you to proceed at your own pace, working through a series of videos and challenges that show you how to build a full web application in just a few weeks. You can also get help from the community if you get stuck.
The camp curriculum covers both Bubble (one of the most advanced visual programming platforms on the Internet) and Canvas, a framework designed to work with Bubble. There are four sections to the course: Introduction to Bubble (2 hours), Bubble Basics (12 hours), Introduction to Canvas (22 hours), and API & Canvas (18 hours). You can find more information at: https://canvas.airdev.co/bootcamp

  • MVP Bootcamp by No Code MBA - An online bootcamp designed to teach you how to build, launch and make money with your no-code apps. The camp includes 6 weeks of training on building an app, along with advice on creating an effective business model. Time commitment is 2 hours per week of online instruction, and a recommended total of 4 hours of offline work per week. 

Also included are weekly meetings to discuss the progress of your project and to review videos and articles on monetizing your app. In addition, you get lifetime access to the bootcamp community forum, along with personal assistance from your instructor and peers in your cohort.

Pricing is quoted at $1,990, although the first cohort to go through the bootcamp will pay only $750 (there's no information on whether a group has already been assembled for the first camp). For more details go to: 

https://www.nocode.mba/bootcamp-mvp#faqs

  • MillionLabs.co.uk - MLabs offers an 8-week no-code camp focused on business applications and learning how to use Bubble (www.bubble.io). The course covers responsive UI design, workflows, data architecture, advanced Bubble skills, and integrating external services through plugins and APIs. Pricing for the bootcamp is listed as starting from $600. Get more information at: https://millionlabs.co.uk/bootcamp.
  • Bubble.io - Bubble offers its own bootcamps in three different formats: a course in fundamentals (4 sessions in 4 weeks), a bootcamp on building and launching your own product idea (8 sessions in 8 weeks), and an advanced course for professionals (8 sessions in 8 weeks). Prices generally range from $600 to $800. Go to https://bubble.io/bootcamps for more information.

Tuesday, August 24, 2021

Nintendo's Visual Programming Game Building App

Nintendo just released "Game Builder Garage", a new app for anyone who wants to learn the basics of video game building. Game Builder Garage uses a visual programming language that lets you connect characters called “Nodon” that have different properties. By combining guided lessons with a variety of Nodon characters and objects, Nintendo hopes to make game building as much fun as playing a video game.

The app has two different modes: Lesson mode and Free Programming mode. Lesson mode consists of interactive lessons that include puzzles and tasks for the user to solve. It's designed to introduce the beginner to the mechanics involved in video game design and to reinforce that material by means of the tasks that have to be completed. 

Free Programming mode allows users to design and create their own games. In this mode you can switch between programming and game playing just by pressing a button, so you can test your design and make changes quickly and easily. In addition, you can upload your game to a "game hub" where other learners can download it and you can download their games, play them and study how they were constructed.

Game Builder Garage is designed for the Nintendo Switch and is available in the Nintendo eShop for $29.99. To see a video of Game Builder Garage in action go to: https://www.nintendo.com/games/detail/game-builder-garage-switch/ 

Monday, August 16, 2021

OpenAI's Codex (Update)


OpenAI's new product Codex translates plain English into ready-to-use program code in any of a number of different programming languages. Codex is currently in beta testing, but will eventually be available to everyone through an API. For a detailed look at how Codex works, go to:

https://www.youtube.com/watch?v=SGUCcjHTmGY

This video, put together by OpenAI's founders, covers the development of Codex and several examples of how it can be used in practice.

Thursday, August 12, 2021

Airtable Acquires Bayes.com

The hottest trends in no-code right now are workflow automation, machine learning, and data visualization. Airtable just made a move to bolster the data analysis and visualization options it offers its users by buying Bayes.com. 

Bayes' system proactively recommends ways to look at and analyze your data as soon as you upload your data file. Many of those data visualization methods are reminiscent of what users try to do with Excel - which should fit right in with Airtable's spreadsheet format. 

With Bayes you also have the ability to easily export your charts and graphs to emails, PowerPoint presentations or other media. You can also use your data visualizations in an interactive mode, allowing others to view and comment on each example.

It's probably going to take a little while for Airtable to get the new forms of data analysis integrated into their platform, but it should add an important feature to their system.

Wednesday, August 11, 2021

OpenAI Codex - A New Brand of No-Code

A few days ago OpenAI (www.openai.com) announced the release of a new version of OpenAI Codex, their AI system that translates natural language statements into program code. Now you can use English to describe what you want your software project to do, and OpenAI's AI model will automatically generate the corresponding computer code, in whatever programming language you choose. 

OpenAI Codex is a "descendant" of GPT-3 (a neural network machine learning model) and is capable of producing code in Python, JavaScript, Go, Perl, PHP, Ruby, Swift and TypeScript. Codex is basically a version of GPT-3 that has been trained on program code instead of ordinary text. That allows Codex to do things like complete lines of code or entire routines, but originally it wasn't a tool that non-coders could easily use.

That’s changed with the new API version, which interprets everyday requests like “make the ball bounce off the sides of the screen” or “download this data using the public API and sort it by date” and produces working program code in any one of a dozen languages. And because Codex is trained on pretty much all the public code on GitHub (and other repositories), it isn't just "fluent" in a variety of coding languages; it's also aware of standard coding practices. 

Because of its natural language processing ability, Codex can also be used to interact with other software, performing functions that aren't built into the application itself. OpenAI also says that Codex can convert or "translate" code from one programming language to another, providing code portability. 
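For beta testers, Codex is reached through OpenAI's existing API and Python library. Here's a rough sketch of what a call might look like; the engine name and parameters are based on OpenAI's beta announcements and may well change by the time Codex is generally available:

```python
import openai

openai.api_key = "YOUR_API_KEY"  # beta access required

# Describe what you want in plain English and let Codex write the code.
# "davinci-codex" was the engine name mentioned for the beta; treat it as a placeholder.
response = openai.Completion.create(
    engine="davinci-codex",
    prompt="# Python\n# Download this data using the public API and sort it by date\n",
    max_tokens=150,
    temperature=0,
)

print(response.choices[0].text)  # the generated code
```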

Right now Codex is only available to a select group of beta testers. Once it's made available to a wider segment of the public, I'll attempt to get access and work out an actual example or two. Until then, Codex is a product to keep a close eye on. 

Friday, August 6, 2021

No-Code Data Science (Part 2)

In Part 1 of this post we covered no-code machine learning platforms in general. In this post I want to go through a few examples of how these platforms actually work. One thing that needs to be done regardless of the platform is to go through the input data (if it hasn't already been prepared for analysis) and get it into the proper format.

If you're supplying your own data, the first step in building a machine learning model is to "clean" the input data that will be used to train the model. For example, let's say you have a voter file and want to make some predictions about voting patterns based on how long ago each person registered to vote. Some of the registration dates may be missing or incomplete (such as 9/22/0), some may be entered with dashes (5-14-1987) and others with slashes (5/14/1987), and some may be obviously incorrect (11-03-1847). The missing and incorrect entries need to be removed from the input data, and the incomplete records may either be removed or have the missing portion of the date filled in. Once that's done, all the records need to be formatted the same way for consistency. Data in some form other than numbers (such as images, sound clips, etc.) may require a different approach, but it will still need some type of preparation before it can be used to train a machine learning model.
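As a rough illustration, here's what that kind of date cleanup might look like in Python with pandas (the file and column names are made up for the example):

```python
import pandas as pd

def parse_date(value):
    """Try both the dash and slash formats; anything else becomes NaT (missing)."""
    for fmt in ("%m-%d-%Y", "%m/%d/%Y"):
        try:
            return pd.to_datetime(value, format=fmt)
        except (ValueError, TypeError):
            continue
    return pd.NaT

df = pd.read_csv("voter_file.csv")  # hypothetical file and column names
df["registration_date"] = pd.to_datetime(df["registration_date"].map(parse_date))

# Drop missing dates and obviously impossible ones (like 11-03-1847),
# then write every remaining date in a single consistent format.
df = df[df["registration_date"].notna()]
df = df[df["registration_date"].dt.year >= 1900]
df["registration_date"] = df["registration_date"].dt.strftime("%m/%d/%Y")
```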

At this point you're ready to choose a platform and begin building your ML model. These days there are quite a few no-code platforms to choose from - here are several examples:

  • Teachable Machine (https://teachablemachine.withgoogle.com/) - Google's Teachable Machine is a good place to start to get a feel for how no-code AI platforms work. You can train models yourself to recognize certain images, sounds or poses (like whether you're sitting down or standing up), and host those models for free directly on the Teachable Machine website or in your own website or app. The models are built using TensorFlow.js, a JavaScript library for machine learning, and are trained using a method called “transfer learning”: a fully trained neural network model is reused, but the original training data is replaced with your own input (see the sketch after this item). 

To see Teachable Machine in action search for: “Teachable Machine Tutorial: Bananameter” by Barron Webster (November 7, 2019). The tutorial shows how he used Teachable Machine to identify images of bananas that aren't ripe yet, bananas that are ripe, and bananas that are past their prime. And for a more in-depth look at how Teachable Machine works, take a look at: “How to build a Teachable Machine with TensorFlow.js” by Nikhil Thorat (deeplearnjs.org).
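Teachable Machine does all of this in the browser with TensorFlow.js, but the underlying idea is easy to see in a few lines of Python with Keras. This is only a sketch of the transfer-learning pattern, not what Teachable Machine literally runs; the base model, image size and class count are arbitrary choices for the example:

```python
import tensorflow as tf

# Start from a network that has already been trained on a large image dataset...
base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet", pooling="avg"
)
base.trainable = False  # ...and keep its learned weights frozen.

# Then add a small classifier on top and train only that part on your own images,
# e.g. three classes of banana ripeness as in the Bananameter tutorial.
model = tf.keras.Sequential([
    base,
    tf.keras.layers.Dense(3, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(your_images, your_labels, epochs=5)  # your own small labeled dataset
```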

  • MonkeyLearn - MonkeyLearn is a no-code machine learning platform that specializes in text classification and extraction models, which you can use to analyze your raw data in Microsoft Excel. To learn exactly how this works, there's a step-by-step explanation of the process at: https://monkeylearn.com/blog/machine-learning-in-excel/.
Scroll down the web page to the heading “How to use Machine Learning in Excel?” and follow the sequence of steps involved. The pre-trained model used is designed to do “sentiment analysis”. It analyzes customer comments uploaded from an Excel spreadsheet, predicts the main sentiment in each set of comments, and returns an Excel spreadsheet with a new column containing the model's predictions. There's also a section in the article covering how to create, train and use a “classification” model, as well as a link to a comprehensive guide to creating compelling data visualizations in Excel.
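The Excel integration is the no-code route, but MonkeyLearn's models can also be reached over its REST API. The sketch below is only meant to show the shape of such a call from Python; the token and model ID are placeholders, and the exact endpoint and response fields should be checked against MonkeyLearn's API documentation:

```python
import requests

API_TOKEN = "YOUR_API_TOKEN"   # placeholder - found in your MonkeyLearn account
MODEL_ID = "cl_xxxxxxxx"       # placeholder ID for a pre-trained sentiment classifier

# Send a batch of customer comments to the classifier and print the raw response.
response = requests.post(
    f"https://api.monkeylearn.com/v3/classifiers/{MODEL_ID}/classify/",
    headers={"Authorization": f"Token {API_TOKEN}"},
    json={"data": ["Great product, arrived on time!", "Support never answered my email."]},
)
print(response.json())  # one result per comment, including the predicted sentiment tag
```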

  • Peltarion - Peltarion has a detailed tutorial to help explain how “deep learning” models are built (https://peltarion.com/blog/applied-ai/defect-detection). The model is designed to detect surface defects in manufactured parts. Properly trained deep learning models are faster, more accurate and more consistent than manual inspections or software using traditional programming. Note: Deep learning is a subset of machine learning and deep learning models are normally trained with artificial neural network algorithms (algorithms that attempt to simulate the reasoning process of the human brain).

This particular model scans images of metal pump impellers and classifies them as either defective or not defective, which makes this a binary image classification problem. The details of how a deep learning model works can be complicated, but you don't need to understand all of them in order to build a model like this. Rather than walking through every screen in depth, I'll just list the steps involved:
    • Create a new project and give it a name.
    • Click the Data library button and search for the “Defects in metal casting” dataset.
    • If you agree with the dataset license, click “Accept” and import the data.
    • Next, you need to pre-process the data, which is split into 80% training data, 10% evaluation data (used to test the model on unlabeled data at the end of each “epoch”, or training cycle, to see how well the training is progressing), and 10% that is saved for any further testing that may be needed.
    • Click the wrench icon to display the “Feature Settings” and make sure “Encoding” is set to “binary” (since the model only has two outcomes - “true”, the part is defective, and “false”, the part isn't defective). Also make sure “Positive class” is set to “true”.
    • Click “Save version”, then click “Use in new experiment”.
    • Note: The pre-processing is complete – the next step is to build the model.
    • Click “Input(s)/Target” in the Experiment Wizard and make sure the Input feature is set to “Image” and the target is set to “Defective”.
    • Click the “Snippet” tab and select “EfficientNet B0”.
    • Click the “Weights” tab. The EfficientNet snippet can be trained with pre-trained weights (which allows you to make use of knowledge gained from another dataset). Using pre-trained weights saves time and improves efficiency.
    • Click “Create” to move to the “Modeling View”.
    • Click on the “Input” block, select “Image augmentation” and click “Natural images”. Image augmentation adds more variation to the images – it isn't necessary, but it may help to increase the model's accuracy.
    • Click “Run” to start the experiment running.
    • Once the experiment has run to completion, click “Evaluation” at the top of the screen to view the results.
    • There are a number of “loss metrics” in the Evaluation view, but the primary ones for this experiment are “binary accuracy” (the percentage of correct predictions) and “recall” (the percentage of actual defective parts that were classified as defective by the model).
    • To inspect the model's predictions, select the subset and epoch (training cycle) you want to look at and click “Inspect”.
    • There are four possibilities for each prediction the model makes – a non-defective impeller is classified correctly as not defective (“false”), a non-defective impeller is classified incorrectly as “defective” (“true”), a defective impeller is classified correctly as defective (“true”), or a defective impeller is classified incorrectly as not defective (“false”).
    • Looking at the “Confusion Matrix” will show the percentages of each outcome (true positive, true negative, false positive, and false negative). Those figures will tell you a lot about how well the model did, but the most important statistic is the number of false negatives (which result in sending out a defective part). Another important metric is the ROC (Receiver Operating Characteristic) curve, which plots true positive rates versus false positive rates.
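If the metric names are unfamiliar, here's a tiny Python example (using scikit-learn, with made-up labels) of how binary accuracy, recall and the confusion matrix counts relate to each other in a defect-detection problem like this one:

```python
from sklearn.metrics import accuracy_score, confusion_matrix, recall_score

# 1 = defective, 0 = not defective (toy data, not the Peltarion dataset)
y_true = [1, 0, 1, 1, 0, 0, 1, 0]   # what the impellers actually were
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]   # what the model predicted

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
print("binary accuracy:", accuracy_score(y_true, y_pred))  # fraction of correct predictions
print("recall:", recall_score(y_true, y_pred))             # fraction of defective parts caught
print("false negatives:", fn)                              # defective parts the model missed
```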

Once the model has been trained, the final step is to deploy it and see how it performs on new data that it hasn't seen before. Here's how to deploy the model:

    • Download and unzip the test data (the portion of the data that wasn't used to train the model).
    • Go to the “Deployment view” and click “New deployment”.
    • When the “Create deployment” screen displays, select the experiment you want to deploy and choose the epoch (training cycle) that had the best results to be your checkpoint.
    • Click the “Enable” button to deploy the experiment.
    • Click the “Test deployment” button to try the model out on the test data you downloaded.

If there's a problem with the results of the deployment test, you can review the different reports from the training run and see if you want to make changes in the project setup and rerun the experiment.
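As an aside, deployed models on platforms like this are usually also reachable as a web endpoint, so you can score new images from your own scripts rather than through the “Test deployment” button. The snippet below is a generic illustration; the URL, authorization header and field name are placeholders, not Peltarion's actual API:

```python
import requests

ENDPOINT = "https://example.com/deployments/your-deployment-id"  # placeholder URL
TOKEN = "YOUR_DEPLOYMENT_TOKEN"                                  # placeholder token

# Send one test image to the deployed model and print its prediction.
with open("impeller_001.jpeg", "rb") as image_file:
    response = requests.post(
        ENDPOINT,
        headers={"Authorization": f"Bearer {TOKEN}"},
        files={"Image": image_file},
    )
print(response.json())
```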

That's a quick look at how no-code AI platforms work. Since this is one of the fastest growing parts of the no-code landscape I plan on doing several more posts on this topic in the near future. In the meantime, there are many other platforms like the ones mentioned above - if you're interested in machine learning, by all means pick one and try building a few models yourself. 


Tuesday, August 3, 2021

No-Code Data Science


The fastest-growing area in programming these days is data analysis. It has become a necessity for businesses of all sizes as a way to determine mission-critical information such as trends in the marketplace, ways to reduce customer churn, and product line strengths and weaknesses. Until recently only large organizations could afford the services of data analysts, specialists who could construct the artificial intelligence models needed to uncover that information. Now, there are an increasing number of "no-code" platforms that allow anyone to quickly build an accurate machine learning model - without the need for coding or training in data science.

As machine learning models become easier to create, they're also coming into play in areas other than the business world. ML models have been used to make predictions ranging from where and when wildfires are most likely to occur to identifying optimal Covid-19 testing and vaccination strategies. In fact, as time goes on and machine learning (as well as deep learning) models become more sophisticated, no-code AI platforms are bound to be used for data analysis in more and more areas.

Popular no-code machine learning platforms include Teachable Machine (by Google), AutoML (part of Google Cloud), Peltarion, BigML, Obviously AI, CreateML (for Mac computers), Google ML Kit (to generate models for Android and iOS devices), and MonkeyLearn. Various studies have indicated that platforms like these have the potential to reduce development time for machine learning models by up to 90 percent. They also allow the end users (the people who are the actual decision makers) to construct the exact model they need. 

Along with empowering "citizen developers", no-code AI platforms provide a couple of other major benefits. They allow experienced data scientists to focus their time and effort on more complex projects, which may in turn lead to more effective no-code solutions. And they make it possible to try out AI solutions much faster and at much less expense than with traditional methods. Building models quickly and cheaply greatly increases the chance of finding a really useful solution to a particular problem, since it's not always clear what type of algorithm will work best.

In some cases a no-code platform will require you to select an algorithm or type of model that fits your data, set certain training parameters, and then "train" the model on your dataset. Other platforms may simply have you pick one of their pre-trained models and use it to analyze your dataset. In either case the process is done without any coding. Training a model on your own data takes longer, but it can produce a more accurate model. Using a pre-trained model usually works best when you're dealing with a common machine learning problem - like predicting customer churn - where standard models are already available.

The general steps in training a machine learning model include (a minimal code sketch of these steps follows the list):

  • Choosing a dataset from the platform's data store or uploading your own file.
  • "Cleaning" the input data - removing any empty or outlier items and possibly "normalizing" the data to get everything on roughly the same scale.
  • Choosing a type of algorithm (such as a classification or regression algorithm) to use to train a model on your data. Each algorithm has certain settings that can be adjusted to produce the best fit with your data. The platform may analyze the data and automatically choose what it considers the optimum settings or you may be able to adjust some of the settings yourself.
  • Using the majority of the input data to train the model and a small portion for validation to see if the model produces accurate results.
  • Deploying the finished model to a website or as part of an app.
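To make those steps concrete, here's roughly what they look like when written out in Python with scikit-learn, using a hypothetical customer-churn file. A no-code platform does essentially the same thing behind the scenes, just without asking you to write any of it:

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Steps 1-2: load the dataset and do some minimal cleaning (drop incomplete rows).
df = pd.read_csv("customers.csv").dropna()  # hypothetical file with a "churned" label column

# One-hot encode any text columns so the algorithm only sees numbers.
X = pd.get_dummies(df.drop(columns=["churned"]))
y = df["churned"]

# Step 4: hold out a small portion of the data for validation, train on the rest.
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.2, random_state=0)

# Step 3: choose an algorithm (a classification algorithm here) and fit it to the training data.
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

print("validation accuracy:", accuracy_score(y_val, model.predict(X_val)))
# Step 5: deployment would mean saving this model and serving it from a website or app.
```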
That covers the general characteristics of a no-code machine learning platform. In Part 2 of this post I'll go over a few specific examples of creating a machine learning model and what results to expect from that model.