SparkFun JetBot AI Kit – Assembly Instructions

Assembly Guide for SparkFun JetBot AI Kit

Introduction

SparkFun’s version of the JetBot merges the industry-leading machine learning capabilities of the NVIDIA Jetson Nano with the vast SparkFun ecosystem of sensors and accessories. Packaged as a ready-to-assemble robotics platform, the SparkFun JetBot Kit requires no additional components or 3D printing to get started – just assemble the robot, boot up the Jetson Nano, connect to WiFi and start using the JetBot immediately. This combination of advanced technologies in a ready-to-assemble package makes the SparkFun JetBot Kit a standout, delivering one of the strongest robotics platforms on the market. This guide serves as the hardware assembly instructions for the two kits that SparkFun sells: the JetBot kit including the Jetson Nano, and the JetBot add-on kit without the NVIDIA Jetson Nano. The SparkFun JetBot comes with a pre-flashed micro SD card image that includes the Nvidia JetBot base image with additional installations of the SparkFun Qwiic Python library, the Edimax WiFi driver, Amazon Greengrass, and JetBot ROS. Users only need to plug in the SD card and set up the WiFi connection to get started.

Completed SparkFun Jetbot

Note: We recommend that you read all of the directions first, before building your JetBot. However, we empathize if you are just here for the pictures & a general feel for the SparkFun JetBot. We are also those people who on occasion void warranties & recycle unopened instruction manuals. However, SparkFun can only provide support for the instructions laid out in the following pages.

Attention: The SD card in this kit comes pre-flashed to work with our hardware and has all the modules installed (including the sample machine learning models needed for the collision avoidance and object following examples). The only software procedures needed to get your JetBot running are steps 2-4 from the Nvidia instructions (i.e. set up the WiFi connection and then connect to the JetBot using a browser). Please DO NOT format or flash a new image onto the SD card; otherwise, you will need to flash our image back onto the card.

If you accidentally make this mistake, don’t worry. You can find instructions for re-flashing our image back onto the SD card in the software section of this guide.

The Jetson Nano Developer Kit offers extensibility through an industry standard GPIO header and associated programming capabilities like the Jetson GPIO Python library. Building off this capability, the SparkFun kit includes the SparkFun Qwiic pHat for Raspberry Pi, enabling immediate access to the extensive SparkFun Qwiic ecosystem from within the Jetson Nano environment, which makes it easy to integrate more than 30 sensors (no soldering and daisy-chainable).


The SparkFun Qwiic Connect System is an ecosystem of I2C sensors, actuators, shields and cables that make prototyping faster and less prone to error. All Qwiic-enabled boards use a common 1mm pitch, 4-pin JST connector. This reduces the amount of required PCB space, and polarized connections mean you can’t hook it up wrong.


Materials

SparkFun Jetbot parts

The SparkFun Jetbot Kit contains the following pieces; roughly top to bottom, left to right.

Part | Qty
Circular Robotics Chassis Kit (Two-Layer) | 1
Lithium Ion Battery Pack – 10Ah (3A/1A USB Ports) | 1
Ball Caster Metal – 3/8″ | 1
Edimax 2-in-1 WiFi and Bluetooth 4.0 Adapter | 1
Header – male – PTH – 40 pin – straight | 1
2 in – 22 gauge solid core hookup wire (red) | 1
Shadow Chassis Motor (pair) | 1
Jetson Dev Kit (Optional) | 1
SparkFun JetBot Acrylic Mounting Plate | 1
SparkFun JetBot image (Pre-Flashed) | 1
Leopard Imaging 145 FOV Camera | 1
Screw Terminals 2.54mm Pitch (2-Pin) | 2
SparkFun Micro OLED Breakout (Qwiic) | 1
SparkFun microB USB Breakout | 1
SparkFun Serial Controlled Motor Driver | 1
Breadboard Mini Self-Adhesive Red | 1
SparkFun Qwiic HAT for Raspberry Pi | 1
SparkFun JetBot Acrylic sidewall for camera mount | 2
SparkFun JetBot Acrylic Camera mount & 4x nylon mounting hardware | 1
Qwiic Cable – 100mm | 1
Qwiic Cable – Female Jumper (4-pin) | 1
Wheels & Tires – included as part of circular robotics chassis | 2
USB Micro-B Cable – 6″ | 2
Dual Lock Velcro | 1
SparkFun Jetbot included hardware

The SparkFun Jetbot Kit contains the following hardware; roughly top to bottom, left to right.

Part | Qty
Hex Standoff #4-40 Alum 2-3/8″ | 3
Standoff – Nylon (4-40; 3/8in.) | 10
1/4″ Phillips Screw with 4-40 Thread | 20
Machine Screw Nut – 4-40 | 10
Circular Robotics Chassis Kit (Two-Layer) Hardware | 1

Recommended Tools

We did not include any tools in this kit because, if you are like us, you are looking for an excuse to use the tools you already have rather than needing new ones for your projects. That said, the following tools will be required to assemble your SparkFun JetBot.

  • A small Phillips & small flat head screwdriver will be needed for chassis assembly & to tighten the screw terminal connections for each motor. We recommend the Pocket Screwdriver Set; TOL-12268.
  • A pair of scissors will be needed to cut the adhesive Dual Lock Velcro strap to the desired size; recommended, but not essential.
  • A soldering kit for assembly & configuration of the SparkFun Serial Controlled Motor Driver – example TOL-14681.
  • Optional: an adjustable wrench or pliers to hold small components (nuts & standoffs) in place while tightening screws; your finger grip is usually enough to hold these in place while tightening screws & helps to ensure nothing is over tightened.

A Note About Directions

When we talk about the ”front” or ”forward” of the JetBot, we are referring to the direction the camera points when the JetBot is fully assembled. ”Left” and ”right” will be from the perspective of the SparkFun JetBot.

Front of Jetbot

1. Circular Robotics Chassis Kit (Two-Layer) Assembly

If you prefer to follow along with a video, check out this feature from the chassis product page. You do not need to use the included ball caster as a larger option has been provided for smoother operation.

Start by attaching the chassis motor mount tabs to each of the ”Shadow Chassis Motors (pair)” using the long threaded machine screws & nuts included with the Circular Robotics Chassis Kit.

Hobby motor and mount

Fit the rubber wheels onto the hubs, install a wheel onto each motor, & fix them into position using the self-tapping screws included with the Circular Robotics Chassis Kit.

hobby motor with wheel

Install the brass-colored standoffs included with the Circular Robotics Chassis Kit; two in the rear and one in the front. The rear of the SparkFun JetBot will be on the side of the plate with the two ”+” sign cutouts. The rear of the motor will be opposite the wheel, where the spindle extends. This orientation ensures the widest base & most stable setup for your JetBot.

motor mount to plate

The motor mounts fit into two mirrored inlets in each base plate as shown. Install the motors opposite one another.

install standoffs

How you attach the motor mount to each motor will dictate how the motor can be installed on the base plate. Note: Do not worry about the motor orientation, as you will determine proper motor operation by how you connect the motor leads to the SparkFun Serial Controlled Motor Driver. Notice how in the picture below one motor has the label facing up, while the other has the label facing down.

both motors on plate and standoffs

Place the other circular robot chassis plate on top and align the two ”+” cutouts and the motor mount tab recesses. Hold the sandwiched chassis together with one hand and install the remaining Phillips head screws included with the Circular Robotics Chassis Kit through the top plate & into the threaded standoffs.

install top plate

Your main chassis is now assembled! The Circular Robotics Chassis Kit also contains a very small caster wheel assembly, but we have included a larger metal caster ball to increase the stability of the SparkFun Jetbot. We will cover the installation of this caster ball later in the tutorial.

screw install top plate

Insert three of the included 1/4 in 4-40 Phillips screws through the top chassis plate with the threads facing up & install the 2-3/8 in #4-40 aluminum hex standoffs until they are finger tight.

install tall standoffs

The aluminum standoffs should be pointing up as shown below.

Standoff installed

The SparkFun JetBot acrylic mounting plate is designed to have two of these aluminum standoffs in the front & one in the rear. We recommend placing the rear standoff on the left side of the chassis (as shown) so the 6 in micro-B USB cables that will be installed later can more easily span the gap needed to power the JetBot.

all standoffs installed

Unpackage the 3/8 in Metal Caster Ball and thread the mounting screws through all pieces as shown. Note that the full stack height will help balance the JetBot in a stable position.

caster wheel assembly
more caster wheel assembly

Install the caster wheel using the Phillips head screws and nuts included with the 3/8 in caster ball assembly. The holes on the caster assembly are spaced to fit snugly on the innermost segment of the angular slots near the rear of the lower plate on the JetBot chassis. Again, hand tight is just fine. Note: if you over tighten these screws it will prevent the ball from easily rotating in the plastic assembly. However, too loose and it may un-thread; go for what feels right.

caster install to chassis

After you have installed the caster & aluminum standoffs, thread the motor wires through the back of the chassis standoffs for use later.

Completed chassis

2. Camera Assembly & Installation

Unpackage the Leopard Imaging camera & align the four holes in the acrylic mounting plate with those on the camera.

Note: ensure that the ribbon cable extends over the edge of the acrylic plate that does not have mounting holes near it, as shown below.

Place all four nylon flathead screws through the camera & acrylic mounting plate prior to fully tightening the nylon nuts. This will ensure equal alignment across all four screws. Tighten the screws while holding the nuts with finger pressure in a rotating crisscross pattern, similar to how you tighten the lug nuts on a car rim.

Camera to mount

Align one acrylic sidewall with the camera mounting plate as shown below, ensuring that the widest section of the sidewall is oriented to the top of the camera mount where the ribbon cable extends.

camera sidewall install

Apply even pressure on each piece until they fit together. Note: these pieces are designed to have an interference fit and will have a nice, satisfying ”click” when they fit together.

camera sidewall complete

Repeat this process on the other side to fully assemble the camera mount.

fully assembled camera

Now install the camera mount to the SparkFun JetBot acrylic mounting plate using the overlapping groove joints. Ensure that the cutout on the acrylic mounting plate faces towards the front/right of the JetBot as shown. This will ensure that there is plenty of room for the camera ribbon cable to pass around the assembly and up to the Jetson Nano camera connector.

camera mount to plate

Install four of the nylon standoffs to the top of the SparkFun Jetbot acrylic mounting plate using four of the included 1/4 in 4-40 Phillips head screws as shown below.

Jetson Nano standoffs to plate
all standoffs on plate

Utilize three more of the 1/4 in 4-40 Phillips head screws to install the SparkFun Jetbot acrylic mounting plate to the aluminum standoffs extending from the Two-layer circular robotics chassis as shown below.

install camera plate to chassis

3. Motor Driver Assembly & Configuration

To get started, make sure that you are familiar with the SparkFun Serial Controlled Motor Driver Hookup Guide.

We also recommend a detailed review of the Hardware Overview of the SparkFun Serial Controlled Motor Driver Here.

Annotated front SparkFun Serial Controlled Motor Driver

You will need to solder both triple jumpers labeled below as ”I2C pull-up enable jumpers,” as the SparkFun pHat utilizes the I2C protocol. The default I2C address used by the pre-flashed SparkFun JetBot image is 0x5D, which corresponds to soldering pad #3 of the ”configuration bits” on the back of the SparkFun Serial Controlled Motor Driver; see below. You will need to create a solder jumper on pad #3 only for the SparkFun JetBot image to work properly.

Annotated rear SparkFun Serial Controlled Motor Driver

Layout of jumpers on the Serial Controlled Motor Driver.

Properly Jumpered SCMD

Jumper 3 of the Configuration Bits properly soldered.
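Once the JetBot is fully assembled and booted, you can sanity-check this address from a terminal on the Jetson Nano. The following is a minimal sketch, assuming the motor driver is connected through the pHat's Qwiic connector, which sits on I2C bus 1 of the Jetson Nano 40-pin header (the bus number is an assumption; try bus 0 if nothing shows up), and that the i2c-tools package is available.

    # Install the I2C utilities if they are not already present
    sudo apt-get install -y i2c-tools

    # Scan I2C bus 1; the Serial Controlled Motor Driver should appear at address 0x5D
    sudo i2cdetect -y -r 1

If 0x5D does not appear, re-check the solder jumper on configuration bit #3 and the Qwiic wiring described in section 4.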

Your completed Serial Controlled Motor Driver should look somewhat similar to the board shown below.

  • The 2-pin screw terminals are soldered to the ”Motor Connections.”
  • Break off 4 Male PTH straight headers and solder into the ”Power (VIN) connection” points.
  • Break off 5 Male PTH straight headers and solder into the ”Expansion port” points. These will not be used, but will provide additional board stability when installed into the mini breadboard.
  • Break off 5 Male PTH straight headers and solder into the ”User port” points for connection into the included Female Jumper Qwiic cables.
completed motor driver
completed motor driver 2

Break off 5 Male PTH straight headers and solder into the breakout points on the SparkFun microB USB Breakout.

Install both the SparkFun Serial Controlled Motor Driver & the SparkFun microB Breakout board on the included mini breadboard so the ”GND” terminals for each unit share a bridge on one side of the breadboard.

Utilize the included 2 in – 22 gauge solid core hookup wire (red) to bridge the ”VCC” pin for the SparkFun microB Breakout to either (VIN) connection point on the SparkFun Serial Controlled Motor Driver as shown below.

motor driver and usb to breadboard

Required power connections between the micro-USB breakout and the Serial Controlled Motor Driver.

motor driver and usb to breadboard

Completed assembly of the micro-USB breakout and Serial Controlled Motor Driver on the breadboard.

Utilize a small flat head screwdriver to loosen the four connection points on the screw terminals. When inserting the motor connection wires, note the desired output given the caution noted in section #1 of this assembly guide.

Note from section #1: Do not worry about the motor orientation as you will determine proper motor operation in how you connect the motor leads to the SparkFun Serial Controlled Motor Driver.

These connection points can be corrected when testing the robot's functionality. If your JetBot goes straight when you expect it to turn, or vice versa, your motor leads need to be corrected.

motor cables to motor driver

Set this assembly aside for full installation later.

4. Accessory Installation to Main Chassis

Align the mounting holes on the SparkFun Micro OLED (Qwiic) with those on the back of the SparkFun Jetbot acrylic mounting plate. Install the Micro OLED using two 1/4 in 4-40 Phillips head screws and two 4-40 machine screw nuts.

qwiic oled to chassis
completed qwiic oled to chassis

Thread the ribbon cable of the Leopard imaging camera back through the acrylic mounting plate and half-helix towards the left side of the Jetbot.

camera ribbon cable threading

Install the Jetson Nano Dev Kit onto the nylon standoffs using four 1/4 in 4-40 Phillips head screws. Tighten each screw a little at a time in a crisscross pattern, until finger tight, to ensure the through holes do not bind during installation. Make sure you can still access the camera ribbon cable.

Jetson Nano install

Note: the camera connector is made from small plastic components & can break more easily than you think. Please be careful with this next step.

Loosen the camera connector with a fingernail or small flathead screwdriver. Fit the ribbon cable into this connector and depress the plastic press fit piece of the connector to hold the ribbon cable in place.

camera attachment to Jetson Nano

Unpackage & install the USB WiFi adapter into one of the USB ports on the Jetson Nano Dev Kit. The drivers for this WiFi adapter are pre-installed on the SparkFun JetBot image. If you are making your own image, you will need to ensure you get these from Edimax.

USB wifi install to Jetson Nano
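If you want to confirm later that the adapter is recognized, the sketch below is a minimal check to run from a terminal once the Jetson Nano is booted; it assumes the standard lsusb and NetworkManager (nmcli) utilities are present on the image.

    # List USB devices; the Edimax adapter should appear as an Edimax/Realtek entry
    lsusb

    # List network interfaces known to NetworkManager; a wifi device should be listed
    nmcli device status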

Align the SparkFun pHat with the GPIO headers on the Jetson Nano Dev Kit so that the pHat overhangs the right hand side of the Jetbot. For additional information on hardware assembly of the SparkFun pHat, please reference the hookup guide here.

Note: The heatsink on the Jetson Nano Dev Kit will only allow for one orientation of the SparkFun pHat.

PHat installation

Wrap the motor wires around the rear/left standoff to take up some of the slack; one or two passes should do. Peel the cover off the self adhesive backing on the mini breadboard you set aside at the end of section #3.

breadboard installation to chassis

Place the breadboard near the back of the JetBot acrylic mounting plate where there is good adhesion & access to all the components. Attach the (4-pin) Female Jumper Qwiic cable to the SparkFun Serial Controlled Motor Driver pins as shown: yellow to ”SCL,” blue to ”SDA,” black to ”GND.”

breadboard placement on chassis and qwiic cable to motor driver

Daisy chain the polarized Qwiic connector on the other end of the (4-pin) Female Jumper Qwiic cable into the back of the SparkFun Micro OLED (Qwiic).

Qwiic cable installation

Using the 100mm Qwiic Cable attach the SparkFun Micro OLED front Qwiic connector to the SparkFun pHat as shown.

Qwiic install to PHat board

Cut the Dual Lock Velcro into two pieces and align them on the 10Ah battery & top plate of the Two-Layer Circular Robotics Chassis as shown below. Ensure that the USB ports on the battery pack are pointing out the back of the Jetbot. Additionally, the orange port (3A) will need to power the Jetson Nano Dev Kit & therefore will need to be on the right side of the Jetbot.

battery pack Velcro placement
Note high amp usb socket

Apply firm pressure to the battery pack to attach to the Jetbot chassis via the Dual Lock Velcro.

battery pack installation

Remove the micro SD card from the SD card adapter.

micro SD card

Insert the micro SD card facing down into the micro SD card slot on the front of the Jetson Nano Dev Kit. Please see the next three pictures for additional details.

install image into SD card slot on Jetson
Card in SD slot
Card in SD clost underview

The back of the 10Ah battery pack has two differently colored USB ports. The black port (1A) is used to power the motor driver via the SparkFun microB breakout. Utilize one of the 6 in micro-B USB cables to supply power to the microB breakout.

USB power motor drivers low amp
Note high amp usb socket

Note: Once you plug the Jetson Nano Dev Kit into the 3A power port, this will ”Boot Jetson Nano” which is not covered in detail until the links in section #5 of this assembly guide. Do not proceed unless you are ready to move forward with the software setup & examples provided by NVIDIA.

The orange port (3A) is used to power the Jetson Nano Dev Kit. Utilize the remaining 6 in micro-B USB cable to supply power to the Jetson Nano Dev Kit.

Final usb cable install

Congratulations! You have fully assembled your SparkFun JetBot AI Kit!

5. Software Setup Guide from NVIDIA

Attention: The SD card in this kit comes pre-flashed to work with our hardware and has all the modules installed (including the sample machine learning models needed for the collision avoidance and object following examples). The only software procedures needed to get your JetBot running are steps 2-4 from the Nvidia instructions (i.e. set up the WiFi connection and then connect to the JetBot using a browser). Please DO NOT format or flash a new image onto the SD card; otherwise, you will need to flash our image back onto the card (instructions below).

Your SparkFun Jetbot comes with a Pre-Flashed micro SD card. Users only need to plug in the SD card and set up the WiFi connection to get started.

  • The default password on everything (i.e. login/user, jupyter notebook, and superuser) is ”jetbot”.
  • We recommend that users change their passwords after initial setup. These are typically covered on the first boot of your Jetson Nano, as detailed in the NVIDIA Getting Started with Jetson Nano walkthrough.
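As a minimal sketch of that recommendation, once you have a terminal open on the JetBot (locally or through JupyterLab), the standard Linux passwd utility changes the login password for the current user; the Jupyter notebook password is configured separately and is not shown here.

    # Change the password for the currently logged-in user (the default user is "jetbot")
    passwd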

Software Setup

The only steps needed to get your JetBot kit up and running are to log into the JetBot and set up your WiFi connection. Once that is done, you are ready to connect to the JetBot wirelessly. If you need instructions for doing so, you can use the link below. However, please take note of our instructions below: you will want to skip steps 1 and 5 to avoid erasing the image on the card or undoing the hardware configuration.

NVIDIA JETBOT WIKI SOFTWARE SETUP
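If you prefer to configure WiFi from a terminal (with a keyboard and monitor attached, or over a wired or serial session) instead of the desktop network menu, the following is a minimal sketch using NetworkManager's nmcli, which should be available on the stock Ubuntu-based image; the SSID and password shown are placeholders.

    # Replace YourNetworkName and YourPassword with your own WiFi credentials
    sudo nmcli device wifi connect "YourNetworkName" password "YourPassword"

    # Confirm the connection and note the IP address you will enter in the browser (port 8888)
    nmcli device status
    ip addr show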

Instructions

  1. Skip step 1 of Nvidia’s instructions: It references how to flash your SD card, so feel free to skip to Step 2 – Boot Jetson Nano.

Note: Following Step 1 will erase the pre-flashed image and make a lot of extra work for yourself.

  2. Skip step 5 of Nvidia’s instructions: This step should already be set up on the pre-flashed SD card.

Get and install the latest JetBot repository from GitHub by entering the following commands

git clone https://github.com/NVIDIA-AI-IOT/jetbot
cd jetbot
sudo python3 setup.py install

Note: Running sudo python3 setup.py install in the command line will overwrite the software modifications for SparkFun’s hardware in the kit.

Troubleshooting

In the event that you accidentally missed the instructions above, here are instructions to get back on track.

Re-Flashing the SD card

If you need to re-flash your SD card, follow the instructions from Step 1 of Nvidia’s guide. However, download and use our image instead (click the link below).

DOWNLOAD SPARKFUN’S JETBOT IMAGE

Note: Don’t forget to uncompress (i.e. unzip, extract, or expand) the file from the .zip file/folder first. You should point the ”flashing” software at the ~62GB .img file (sparkfun_jetbot_v01-00.img) to flash the image onto the SD card.

Alternatively, there are other options for flashing images onto an SD card. If you have a preferred method, feel free to use the option you are most comfortable with.
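As one such alternative, the sketch below writes the extracted image from a Linux machine using dd. Treat it as an illustration only: /dev/sdX is a placeholder for your SD card's device node, and writing to the wrong device will destroy data, so double-check with lsblk first.

    # Identify your SD card device (for example /dev/sdb or /dev/mmcblk0); /dev/sdX below is a placeholder
    lsblk

    # Write the extracted image to the card, then flush buffers before removing it
    sudo dd if=sparkfun_jetbot_v01-00.img of=/dev/sdX bs=4M status=progress conv=fsync
    sync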

Re-Applying the Software Modifications

If you have accidentally overwritten the software modifications for the hardware included in your kit, you will need to repeat Step 5 from Nvidia’s guide from the desktop interface (if you are comfortable performing the following steps from the command line, feel free to do so).

Skip steps 1 and 2: Plug in a keyboard, mouse, and monitor. Then log in to the desktop interface (if you haven’t changed your password, the default password is: jetbot).

Follow step 3: Launch the terminal. There is an icon on the sidebar on the left-hand side. Otherwise, you can use the keyboard shortcut (Ctrl + Alt + T).

Follow step 4: However, before you execute sudo python3 setup.py install, you will want to copy our file modifications into the jetbot directory you are in.

  1. Begin by downloading our files (click link below).

DOWNLOAD MODIFICATION FILES

  2. Next, extract the file.
  3. Next, replace the files in the jetbot folder. The file paths must be the same, so make sure to overwrite files exactly.

Click on the icon that looks like a filing cabinet on the left hand side of the GUI. This is your Home directory. From here, you will need to proceed into the jetbot folder. There you will find a jetbot folder with similar files to the ones you just extracted. Delete the folder and copy in our files (you can also just overwrite the files as well).

  4. Now, you can execute sudo python3 setup.py install in the terminal (a command-line sketch of this sequence follows below).
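For those comfortable in the terminal, here is a minimal command-line sketch of steps 2 through 4 above. The download and extraction paths are assumptions; adjust them to wherever you saved and extracted the modification files.

    # Assumed path: the modification archive was extracted to ~/Downloads/jetbot_modifications
    cd ~/jetbot

    # Overwrite the inner jetbot folder with the modified files (the file paths must match exactly)
    cp -r ~/Downloads/jetbot_modifications/jetbot/* ./jetbot/

    # Re-run the installer so the modified modules are picked up
    sudo python3 setup.py install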

Follow step 5: Finish up by following step 5. Now you are back on track to getting your Jetbot running again!

6. Examples

The ”object following” Jupyter notebook example won’t work because the required dependencies had not been released by NVIDIA prior to the creation of the SparkFun JetBot image. These updates can be manually installed on your Jetson Nano with the JetPack 4.2.1 release.

Update: The engine generated for the example utilized a previous version of TensorRT and is therefore not compatible with the latest release. For more details on this issue, check out the following GitHub issue.

NVIDIA JETBOT WIKI EXAMPLES

Resources and Going Further

Now that you’ve successfully got your JetBot AI up and running, it’s time to incorporate it into your own project!

For more information, check out the resources below:

Need some inspiration for your next project? Check out some of these related tutorials:

Easy Driver Hook-up Guide

Get started using the SparkFun Easy Driver for those projects that need a little motion.

Servo Trigger Hookup Guide

How to use the SparkFun Servo Trigger to control a vast array of Servo Motors, without any programming!

SparkFun 5V/1A LiPo Charger/Booster Hookup Guide

This tutorial shows you how to hook up and use the SparkFun 5V/1A LiPo Charger/Booster circuit.

Wireless Remote Control with micro:bit

In this tutorial, we will utilize the MakeCode radio blocks to have the one micro:bit transmit a signal to a receiving micro:bit on the same channel. Eventually, we will control a micro:bot wirelessly using parts from the arcade:kit!

Webinar on Nvidia Jetson Nano

Join us for this engaging, informative webinar and live Q&A session to find out more about the hardware and software behind Jetson Nano. See how you can create and deploy your own deep learning models along with building autonomous robots and smart devices powered by AI. Find more resources for Jetson Nano at https://developer.nvidia.com/embedded…


Hello, AI World. Meet Jetson Nano (1:20:13)

Regression Analysis with TensorFlow and Keras

You often need to solve regression problems when training your machine learning models. In this episode of Coding TensorFlow, Robert Crowe discusses how to build and train a TensorFlow model with Keras, where you look for a model that produces a single numerical result, in other words regression.
Learn how to get started with regression problems through an example where the AI model predicts a car's fuel consumption in miles per gallon. This requires our model to examine and learn from the data we provide in order to predict our final number.


Get started with using TensorFlow to solve for regression problems (Coding TensorFlow) (11:38)

Get the Colab & follow along here → http://bit.ly/2xV8rVg
UCI dataset repository → http://bit.ly/2k2xH8i
Watch more Coding TensorFlow → https://bit.ly/Coding-TensorFlow

Neural Network Regression Model with Keras | Keras #3

In this video, both a linear and a non-linear regression model are used to predict the number of views of a YouTube video based on that video's likes, dislikes, and subscribers (a web crawler was used to collect these statistics). The models are neural networks, implemented with the Keras API and a TensorFlow backend.
The video covers topics such as what regression is, how to set things up in a Jupyter Notebook, train-test split, validation split, scaling/normalizing data and when it is a good idea, batch size, Stochastic Gradient Descent (SGD), Adam, epochs, iterations, learning rates, the r2 (r^2) score, and more.

Neural Network Regression Model with Keras | Keras #3 (19:04)

Introduction to Machine Learning with TensorFlow 2.0

Machine Learning (ML) represents a new paradigm in programming: instead of programming explicit rules in a language such as Java or C++, you build a system that is trained on data from a large number of examples and can then draw conclusions about new data based on the patterns identified in the training data.
But what does ML actually look like? In part one of Machine Learning Zero to Hero, AI advocate Laurence Moroney (lmoroney@) walks through a basic Hello World example of building an ML model, introducing ideas that we will apply in the later section on Computer Vision further down this page.
If you want a somewhat more comprehensive introduction, I recommend Introduction to TensorFlow 2.0: Easier for beginners, and more powerful for experts.

(40:55)

Intro to Machine Learning (ML Zero to Hero, part 1)

Try this code yourself in the Hello World of Machine Learning: https://goo.gle/2Zp2ZF3

Basic Computer Vision with ML (ML Zero to Hero, part 2)

In part two of Machine Learning Zero to Hero, AI advocate Laurence Moroney (lmoroney@) covers basic Computer Vision with machine learning by teaching a computer how to see and recognize different objects (Object Recognition).

Fashion MNIST – a dataset of clothing images for benchmarking

Fashion-MNIST is a research project by Kashif Rasul & Han Xiao in the form of a dataset of Zalando's article images. It consists of a training set of 60,000 image examples and a test set of 10,000 examples. Each example is a 28 × 28 pixel grayscale image, associated with a label from 10 classes (clothing categories).
Fashion-MNIST is intended to serve as a direct drop-in replacement for the original MNIST dataset for benchmarking machine learning algorithms.

fashion-mnist-sprite

Fashion MNIST dataset

Why is this of interest to the scientific community?

The original MNIST dataset contains many handwritten digits. People in the AI/ML/data science community love this dataset and use it as a benchmark to validate their algorithms. In fact, MNIST is often the first dataset they try. ”If it doesn't work on MNIST, it won't work at all,” as the saying goes. ”Well, if it does work on MNIST, it may still fail on others.”

MNIST dataset for digit classification

Fashion-MNIST is intended to serve as a direct drop-in replacement for the original MNIST dataset when benchmarking machine learning algorithms, since it shares the same image size and the same structure of training and test splits.

Why should you replace MNIST with Fashion MNIST? Here are some good reasons:

GitHub:

Find detailed information and the data set on GitHub

Here is a computer vision example you can try yourself: https://goo.gle/34cHkDk

Watch more Coding TensorFlow → https://bit.ly/Coding-TensorFlow
Subscribe to the TensorFlow channel → http://bit.ly/2ZtOqA3

Introducing convolutional neural networks (ML Zero to Hero, part 3)

In part three of Machine Learning Zero to Hero, AI advocate Laurence Moroney (lmoroney@) discusses Convolutional Neural Networks (CNNs) and why they are so powerful in computer vision scenarios. A convolution is a filter that passes over an image, processes it, and extracts features from the image. In this video you will see how they work, by processing an image to see whether specific features can be found in it.

Codelab: Introduction to convolutions → http://bit.ly/2lGoC5f


Introducing convolutional neural networks (ML Zero to Hero, part 3)

Build an image classifier (ML Zero to Hero, part 4)

In part four of Machine Learning Zero to Hero, AI advocate Laurence Moroney (lmoroney@) discusses building an image classifier for rock, paper, scissors. In episode one we showed a rock, paper, scissors scenario and discussed how difficult it can be to write code to detect and classify these. In the following episodes we learned how to build neural networks to detect patterns in the pixels of the images, to classify them, and to detect features using an image classification system with a Convolutional Neural Network (CNN). In this episode we put all the information from the first three parts of the series together.

Colab notebook: http://bit.ly/2lXXdw5
Rock, paper, scissors dataset: http://bit.ly/2kbV92O


Build an image classifier (ML Zero to Hero, part 4)

What are neural networks and machine learning?

It seems like everything is powered by AI these days. However, it is rarely a matter of particularly intelligent systems, or real Artificial Intelligence. AI is mostly used as a marketing term for what is usually machine learning (ML) and techniques such as neural networks (NN). These terms can seem a bit intimidating and difficult, but they are actually not as complex as you might think.

The following video gives a simple and clear explanation of what neural networks and machine learning are, how they work, and what we can use them for.

Designing adaptive intelligent user interfaces

What if you could predict user behavior with smart user interfaces? With probability-driven statecharts, decision trees, reinforcement learning, and more, user interfaces (UIs) can be developed in such a way that they automatically adapt to the user's behavior.

In the video below you will see how to create adaptive and intelligent user interfaces that learn how individual users use your apps and tailor the interface and features for them in real time.

Mind Reading with Intelligent & Adaptive UIs (23:11)

Model-driven development

Programming biological cells – the next software revolution

The next software revolution – programming biological cells

00:04
The second half of the last century was completely defined by a technological revolution: the software revolution. The ability to program electrons on a material called silicon made possible technologies, companies and industries that were at one point unimaginable to many of us, but which have now fundamentally changed the way the world works. The first half of this century, though, is going to be transformed by a new software revolution: the living software revolution. And this will be powered by the ability to program biochemistry on a material called biology. And doing so will enable us to harness the properties of biology to generate new kinds of therapies, to repair damaged tissue, to reprogram faulty cells or even build programmable operating systems out of biochemistry. If we can realize this — and we do need to realize it — its impact will be so enormous that it will make the first software revolution pale in comparison.

01:11
And that’s because living software would transform the entirety of medicine, agriculture and energy, and these are sectors that dwarf those dominated by IT. Imagine programmable plants that fix nitrogen more effectively or resist emerging fungal pathogens, or even programming crops to be perennial rather than annual so you could double your crop yields each year. That would transform agriculture and how we’ll keep our growing and global population fed. Or imagine programmable immunity, designing and harnessing molecular devices that guide your immune system to detect, eradicate or even prevent disease. This would transform medicine and how we’ll keep our growing and aging population healthy.

01:59
We already have many of the tools that will make living software a reality. We can precisely edit genes with CRISPR. We can rewrite the genetic code one base at a time. We can even build functioning synthetic circuits out of DNA. But figuring out how and when to wield these tools is still a process of trial and error. It needs deep expertise, years of specialization. And experimental protocols are difficult to discover and all too often, difficult to reproduce. And, you know, we have a tendency in biology to focus a lot on the parts, but we all know that something like flying wouldn’t be understood by only studying feathers. So programming biology is not yet as simple as programming your computer. And then to make matters worse, living systems largely bear no resemblance to the engineered systems that you and I program every day. In contrast to engineered systems, living systems self-generate, they self-organize, they operate at molecular scales. And these molecular-level interactions lead generally to robust macro-scale output. They can even self-repair.

03:07
Consider, for example, the humble household plant, like that one sat on your mantelpiece at home that you keep forgetting to water. Every day, despite your neglect, that plant has to wake up and figure out how to allocate its resources. Will it grow, photosynthesize, produce seeds, or flower? And that’s a decision that has to be made at the level of the whole organism. But a plant doesn’t have a brain to figure all of that out. It has to make do with the cells on its leaves. They have to respond to the environment and make the decisions that affect the whole plant. So somehow there must be a program running inside these cells, a program that responds to input signals and cues and shapes what that cell will do. And then those programs must operate in a distributed way across individual cells, so that they can coordinate and that plant can grow and flourish.

03:59
If we could understand these biological programs, if we could understand biological computation, it would transform our ability to understand how and why cells do what they do. Because, if we understood these programs, we could debug them when things go wrong. Or we could learn from them how to design the kind of synthetic circuits that truly exploit the computational power of biochemistry.

04:25
My passion about this idea led me to a career in research at the interface of maths, computer science and biology. And in my work, I focus on the concept of biology as computation. And that means asking what do cells compute, and how can we uncover these biological programs? And I started to ask these questions together with some brilliant collaborators at Microsoft Research and the University of Cambridge, where together we wanted to understand the biological program running inside a unique type of cell: an embryonic stem cell. These cells are unique because they’re totally naïve. They can become anything they want: a brain cell, a heart cell, a bone cell, a lung cell, any adult cell type. This naïvety, it sets them apart, but it also ignited the imagination of the scientific community, who realized, if we could tap into that potential, we would have a powerful tool for medicine. If we could figure out how these cells make the decision to become one cell type or another, we might be able to harness them to generate cells that we need to repair diseased or damaged tissue. But realizing that vision is not without its challenges, not least because these particular cells, they emerge just six days after conception. And then within a day or so, they’re gone. They have set off down the different paths that form all the structures and organs of your adult body.

05:51
But it turns out that cell fates are a lot more plastic than we might have imagined. About 13 years ago, some scientists showed something truly revolutionary. By inserting just a handful of genes into an adult cell, like one of your skin cells, you can transform that cell back to the naïve state. And it’s a process that’s actually known as ”reprogramming,” and it allows us to imagine a kind of stem cell utopia, the ability to take a sample of a patient’s own cells, transform them back to the naïve state and use those cells to make whatever that patient might need, whether it’s brain cells or heart cells.

06:30
But over the last decade or so, figuring out how to change cell fate, it’s still a process of trial and error. Even in cases where we’ve uncovered successful experimental protocols, they’re still inefficient, and we lack a fundamental understanding of how and why they work. If you figured out how to change a stem cell into a heart cell, that hasn’t got any way of telling you how to change a stem cell into a brain cell. So we wanted to understand the biological program running inside an embryonic stem cell, and understanding the computation performed by a living system starts with asking a devastatingly simple question: What is it that system actually has to do?

07:13
Now, computer science actually has a set of strategies for dealing with what it is the software and hardware are meant to do. When you write a program, you code a piece of software, you want that software to run correctly. You want performance, functionality. You want to prevent bugs. They can cost you a lot. So when a developer writes a program, they could write down a set of specifications. These are what your program should do. Maybe it should compare the size of two numbers or order numbers by increasing size. Technology exists that allows us automatically to check whether our specifications are satisfied, whether that program does what it should do. And so our idea was that in the same way, experimental observations, things we measure in the lab, they correspond to specifications of what the biological program should do.

08:02
So we just needed to figure out a way to encode this new type of specification. So let’s say you’ve been busy in the lab and you’ve been measuring your genes and you’ve found that if Gene A is active, then Gene B or Gene C seems to be active. We can write that observation down as a mathematical expression if we can use the language of logic: If A, then B or C. Now, this is a very simple example, OK. It’s just to illustrate the point. We can encode truly rich expressions that actually capture the behavior of multiple genes or proteins over time across multiple different experiments. And so by translating our observations into mathematical expression in this way, it becomes possible to test whether or not those observations can emerge from a program of genetic interactions.

08:55
And we developed a tool to do just this. We were able to use this tool to encode observations as mathematical expressions, and then that tool would allow us to uncover the genetic program that could explain them all. And we then apply this approach to uncover the genetic program running inside embryonic stem cells to see if we could understand how to induce that naïve state. And this tool was actually built on a solver that’s deployed routinely around the world for conventional software verification. So we started with a set of nearly 50 different specifications that we generated from experimental observations of embryonic stem cells. And by encoding these observations in this tool, we were able to uncover the first molecular program that could explain all of them.

09:43
Now, that’s kind of a feat in and of itself, right? Being able to reconcile all of these different observations is not the kind of thing you can do on the back of an envelope, even if you have a really big envelope. Because we’ve got this kind of understanding, we could go one step further. We could use this program to predict what this cell might do in conditions we hadn’t yet tested. We could probe the program in silico.

10:08
And so we did just that: we generated predictions that we tested in the lab, and we found that this program was highly predictive. It told us how we could accelerate progress back to the naïve state quickly and efficiently. It told us which genes to target to do that, which genes might even hinder that process. We even found the program predicted the order in which genes would switch on. So this approach really allowed us to uncover the dynamics of what the cells are doing.

10:39
What we’ve developed, it’s not a method that’s specific to stem cell biology. Rather, it allows us to make sense of the computation being carried out by the cell in the context of genetic interactions. So really, it’s just one building block. The field urgently needs to develop new approaches to understand biological computation more broadly and at different levels, from DNA right through to the flow of information between cells. Only this kind of transformative understanding will enable us to harness biology in ways that are predictable and reliable.

11:12
But to program biology, we will also need to develop the kinds of tools and languages that allow both experimentalists and computational scientists to design biological function and have those designs compile down to the machine code of the cell, its biochemistry, so that we could then build those structures. Now, that’s something akin to a living software compiler, and I’m proud to be part of a team at Microsoft that’s working to develop one. Though to say it’s a grand challenge is kind of an understatement, but if it’s realized, it would be the final bridge between software and wetware.

11:48
More broadly, though, programming biology is only going to be possible if we can transform the field into being truly interdisciplinary. It needs us to bridge the physical and the life sciences, and scientists from each of these disciplines need to be able to work together with common languages and to have shared scientific questions.

12:08
In the long term, it’s worth remembering that many of the giant software companies and the technology that you and I work with every day could hardly have been imagined at the time we first started programming on silicon microchips. And if we start now to think about the potential for technology enabled by computational biology, we’ll see some of the steps that we need to take along the way to make that a reality. Now, there is the sobering thought that this kind of technology could be open to misuse. If we’re willing to talk about the potential for programming immune cells, we should also be thinking about the potential of bacteria engineered to evade them. There might be people willing to do that. Now, one reassuring thought in this is that — well, less so for the scientists — is that biology is a fragile thing to work with. So programming biology is not going to be something you’ll be doing in your garden shed. But because we’re at the outset of this, we can move forward with our eyes wide open. We can ask the difficult questions up front, we can put in place the necessary safeguards and, as part of that, we’ll have to think about our ethics. We’ll have to think about putting bounds on the implementation of biological function. So as part of this, research in bioethics will have to be a priority. It can’t be relegated to second place in the excitement of scientific innovation.

13:26
But the ultimate prize, the ultimate destination on this journey, would be breakthrough applications and breakthrough industries in areas from agriculture and medicine to energy and materials and even computing itself. Imagine, one day we could be powering the planet sustainably on the ultimate green energy if we could mimic something that plants figured out millennia ago: how to harness the sun’s energy with an efficiency that is unparalleled by our current solar cells. If we understood that program of quantum interactions that allow plants to absorb sunlight so efficiently, we might be able to translate that into building synthetic DNA circuits that offer the material for better solar cells. There are teams and scientists working on the fundamentals of this right now, so perhaps if it got the right attention and the right investment, it could be realized in 10 or 15 years.

14:18
So we are at the beginning of a technological revolution. Understanding this ancient type of biological computation is the critical first step. And if we can realize this, we would enter in the era of an operating system that runs living software.

Measuring CO2 and VOC with an ESP32

Do you sometimes feel tired during meetings or at school?
Do you sometimes have a headache after work or school?
Do you want to change that? Then it may be interesting for you to measure harmful gases in the air of your work environment, which can cause both tiredness and headaches.

In the video below, an ESP32 and two ESP8266 boards with sensors are used to build a system that measures air quality. The sensors used are: Winsen MH-Z19, Sensirion SGP30, and SCD30.
In this video:

  • Focus on indoor climate
  • Focus on gases where the main source is people
  • The impact of CO2 on indoor air quality
  • See the relationship between CO2 sensors and global warming
  • Use another way to assess indoor air: VOC or eCO2
  • And we will build sensors that transmit values to Grafana

How to measure CO2 and VOC with ESP Microprocessors. Which one is better? (21:12)



Machine Learning for the Web

With TensorFlow.js you can quickly and easily create web applications that use Artificial Intelligence (AI) and Machine Learning (ML) with just a few lines of JavaScript code.
There are quite a few pre-built and pre-trained ML models with JavaScript APIs that you can use directly for applications such as:
Image Classification
Image Segmentation
Object Detection
Pose Detection
Speech Commands
Text Classifications
Augmented Reality
Gesture-based interaction
Speech recognition
Accessible web apps
Sentiment analysis, abuse detection
Conversational AI
Web page optimization
and more.

Machine Learning magic for your web application with TensorFlow.js (Chrome Dev Summit 2019) 8:30






Learn more: TensorFlow.js → https://goo.gle/2XLhMe0
Tensorflow.js Github → https://goo.gle/2DcgLCe
#ChromeDevSummit All Sessions → https://goo.gle/CDS19

The values and principles of the Agile Manifesto

The Four Values of The Agile Manifesto

The Agile Manifesto comprises four foundational values and 12 supporting principles that guide the Agile approach to software development. Each Agile methodology applies the four values in different ways, but all of them rely on these values to guide the development and delivery of high-quality, working software.

1. Individuals and Interactions Over Processes and Tools
The first value in the Agile Manifesto is “Individuals and interactions over processes and tools.” Valuing people more highly than processes or tools is easy to understand because it is the people who respond to business needs and drive the development process. If the process or the tools drive development, the team is less responsive to change and less likely to meet customer needs. Communication is an example of the difference between valuing individuals versus process. In the case of individuals, communication is fluid and happens when a need arises. In the case of process, communication is scheduled and requires specific content.

2. Working Software Over Comprehensive Documentation
Historically, enormous amounts of time were spent on documenting the product for development and ultimate delivery. Technical specifications, technical requirements, technical prospectus, interface design documents, test plans, documentation plans, and approvals required for each. The list was extensive and was a cause for the long delays in development. Agile does not eliminate documentation, but it streamlines it in a form that gives the developer what is needed to do the work without getting bogged down in minutiae. Agile documents requirements as user stories, which are sufficient for a software developer to begin the task of building a new function.
The Agile Manifesto values documentation, but it values working software more.

3. Customer Collaboration Over Contract Negotiation
Negotiation is the period when the customer and the product manager work out the details of a delivery, with points along the way where the details may be renegotiated. Collaboration is a different creature entirely. With development models such as Waterfall, customers negotiate the requirements for the product, often in great detail, prior to any work starting. This meant the customer was involved in the process of development before development began and after it was completed, but not during the process. The Agile Manifesto describes a customer who is engaged and collaborates throughout the development process, making. This makes it far easier for development to meet their needs of the customer. Agile methods may include the customer at intervals for periodic demos, but a project could just as easily have an end-user as a daily part of the team and attending all meetings, ensuring the product meets the business needs of the customer.

4. Responding to Change Over Following a Plan
Traditional software development regarded change as an expense, so it was to be avoided. The intention was to develop detailed, elaborate plans with a defined set of features, with everything generally given as high a priority as everything else, and with a large number of dependencies requiring delivery in a certain order so that the team could work on the next piece of the puzzle.

With Agile, the shortness of an iteration means priorities can be shifted from iteration to iteration and new features can be added into the next iteration. Agile’s view is that changes always improve a project; changes provide additional value.

Perhaps nothing illustrates Agile’s positive approach to change better than the concept of Method Tailoring, defined in An Agile Information Systems Development Method in use as: “A process or capability in which human agents determine a system development approach for a specific project situation through responsive changes in, and dynamic interplays between contexts, intentions, and method fragments.” Agile methodologies allow the Agile team to modify the process and make it fit the team rather than the other way around.

The Twelve Agile Manifesto Principles

The Twelve Principles are the guiding principles for the methodologies that are included under the title “The Agile Movement.” They describe a culture in which change is welcome, and the customer is the focus of the work. They also demonstrate the movement’s intent as described by Alistair Cockburn, one of the signatories to the Agile Manifesto, which is to bring development into alignment with business needs.

The twelve principles of agile development include:

  1. Customer satisfaction through early and continuous software delivery – Customers are happier when they receive working software at regular intervals, rather than waiting extended periods of time between releases.
  2. Accommodate changing requirements throughout the development process – The ability to avoid delays when a requirement or feature request changes.
  3. Frequent delivery of working software – Scrum accommodates this principle since the team operates in software sprints or iterations that ensure regular delivery of working software.
  4. Collaboration between the business stakeholders and developers throughout the project – Better decisions are made when the business and technical team are aligned.
  5. Support, trust, and motivate the people involved – Motivated teams are more likely to deliver their best work than unhappy teams.
  6. Enable face-to-face interactions – Communication is more successful when development teams are co-located.
  7. Working software is the primary measure of progress – Delivering functional software to the customer is the ultimate factor that measures progress.
  8. Agile processes to support a consistent development pace – Teams establish a repeatable and maintainable speed at which they can deliver working software, and they repeat it with each release.
  9. Attention to technical detail and design enhances agility – The right skills and good design ensure the team can maintain the pace, constantly improve the product, and sustain change.
  10. Simplicity – Develop just enough to get the job done for right now.
  11. Self-organizing teams encourage great architectures, requirements, and designs – Skilled and motivated team members who have decision-making power, take ownership, communicate regularly with other team members, and share ideas that deliver quality products.
  12. Regular reflections on how to become more effective – Self-improvement, process improvement, advancing skills, and techniques help team members work more efficiently.

The intention of Agile is to align development with business needs, and the success of Agile is apparent. Agile projects are customer focused and encourage customer guidance and participation. As a result, Agile has grown to be an overarching view of software development throughout the software industry and an industry all by itself.


Kent Beck
Mike Beedle
Arie van Bennekum
Alistair Cockburn
Ward Cunningham
Martin Fowler
James Grenning
Jim Highsmith
Andrew Hunt
Ron Jeffries
Jon Kern
Brian Marick
Robert C. Martin
Steve Mellor
Ken Schwaber
Jeff Sutherland
Dave Thomas

© 2001, the above authors