
connected bluetooth on serial () at baudrate ()

Description

Connected bluetooth on serial () at baudrate () is a stack block available in the Dabble extension for Arduino Mega. The Oscilloscope module allows you to visualize the input and output signals of your hardware device on your smartphone. This block sets the serial channel and baud rate of the Bluetooth module that you have connected to the Arduino Mega.

Input Parameters

  1. Select the serial channel from the drop-down list on which you can connect the Bluetooth module.
  2. Select the baud rate from the drop-down list.


Note: This block is available only in Upload mode.

Example

In the given script, we will visualize the voltage level across an analog pin on the Mega.

Example

Learn the steps required to create a Humanoid dance sequence. This guide covers programming a Humanoid robot and refining the sequence to create an engaging and entertaining performance.

Introduction

A Humanoid dance sequence is a set of programmed instructions that allows a Humanoid robot to perform a dance routine. Typically, these sequences involve a combination of movements and actions performed by the robot in a coordinated manner, to create an entertaining and engaging dance performance.

The process typically involves the following steps:

  1. Define the dance moves
  2. Sequence the moves
  3. Program the robot
  4. Test and refine

Creating a Humanoid dance sequence involves a combination of creativity, technical skill, and attention to detail, and can result in an engaging and entertaining performance that showcases the capabilities of robotic technology.

Code

Logic

  1. Drag and drop the set pin RHip () LHip () RFoot () LFoot () RHand () LHand () block from the Humanoid extension – This block is used to set the pins of the robot that control its movement.
  2. Initialize the Humanoid in the home position – This means that at the start of the program, the Humanoid robot will be in its default position.
  3. Drag and drop the forever block for a continuous loop – This programming construct ensures that the code inside the loop is executed continuously.
  4. The display first shows some light, then the robot plays a sound and performs an action for a specific time at a specific speed – This could refer to displaying some LED lights and playing some sound effects as the robot performs a specific action, which could involve movement in a certain direction at a particular speed.
  5. Then drag and drop the repeat block to repeat a particular action a specific number of times, or for a particular period, with the do () action () times at () speed block.
  6. Then drag and drop different actions, each with a specific time and speed, using the do () action () times at () speed block.
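For reference, the same logic can be sketched in PictoBlox Python, storing the sequenced moves as a list and playing them back. This is a minimal sketch: the pins and action names are illustrative assumptions, so match them to your own robot and to the actions chosen in your script.

sprite = Sprite('Tobi')
import time

# Steps 1 and 2: define and sequence the dance moves as (action, time in ms, speed) entries.
# These action names are examples; use any actions your Humanoid supports.
dance_moves = [
  ("dance1", 1000, 1),
  ("flapping", 1000, 1),
  ("moonwalker", 1000, 1),
]

# Step 3: program the robot (the pins here are assumptions; match your wiring)
humanoid = Humanoid(7, 2, 6, 3, 8, 1)
humanoid.home()

# Step 4: test the sequence, then refine the moves, timings, and speeds
for action, duration, speed in dance_moves:
  humanoid.action(action, duration, speed)
  time.sleep(1)

humanoid.home()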

Output

Read More
This code example shows how to program a humanoid robot to perform a dance routine. Learn how to use the Humanoid object with specific pins to control robot movement.

Introduction

Humanoid is a class in a programming code that is used to control the movements of a humanoid robot. The code provides specific pins to control the robot’s movement, and it allows the robot to perform a series of actions such as dancing, flapping, moving forward, and other actions. We can learn how to program a humanoid robot to dance.

Code

sprite = Sprite('Tobi')
quarky=Quarky()
import time

humanoid = Humanoid(7,2,6,3,8,1)

quarky.showemotion("surprise")
humanoid.home()
humanoid.move("forward",1000,1)
quarky.playsound("QuarkyIntro")
humanoid.action("flapping",1000,1)
time.sleep(1)
humanoid.action("dance2",1000,1)
time.sleep(1)
humanoid.action("moonwalker",1000,1)
time.sleep(1)
humanoid.action("dance1",1000,1)
time.sleep(1)
humanoid.action("forward",1000,1)
time.sleep(1)
Humanoid.action("tiptoeswing",1000,1)
time.sleep(1)
Humanoid.action("swing",1000,1)
time.sleep(1)
Humanoid.action("flapping",1000,1)
time.sleep(1)
Humanoid.action("updown",1000,1)
Humanoid.home()

Logic

  1. This code is an example of how to program a humanoid robot to perform a dance routine.
  2. The code imports the time module for later use.
  3. Then we initialize the Humanoid object with specific pins to control the movement of the robot. The pins are passed as arguments to the Humanoid function.
  4. Furthermore, we involve the Quarky object showing emotion, the humanoid robot moving to its home position, and then moving forward for a specified duration and speed.
  5. Then, Quarky plays a sound, and the humanoid performs a series of different actions, each with specific duration and speed, including flapping, dancing, moving forward, and other actions.
  6. The time.sleep(1) calls are used to pause the program for one second between each action.
  7. Finally, the humanoid returns to its home position at the end of the program.

Output

Read More
Learn how to code logic for video input detection with this example code. You will be able to direct your own Mecanum easily by just showing signs through the camera input.

Introduction

A sign detector Mecanum robot is a robot that can recognize and interpret certain signs or signals, such as hand gestures or verbal commands, given by a human. The robot uses sensors, cameras, and machine learning algorithms to detect and understand the sign, and then performs a corresponding action based on the signal detected.

These robots are often used in manufacturing, healthcare, and customer service industries to assist with tasks that require human-like interaction and decision making.

Code

sprite = Sprite('Tobi')
quarky = Quarky()
import time
meca=Mecanum(1,2,7,8)
recocards = RecognitionCards()

recocards.video("on flipped")
recocards.enablebox()
recocards.setthreshold(0.6)

while True:
  recocards.analysecamera()
  sign = recocards.classname()
  sprite.say(sign + ' detected')
  if recocards.count() > 1:
    if 'Go' in sign:
      meca.runtimedrobot("forward",100,2)
    if 'Turn Left' in sign:
      meca.runtimedrobot("lateral left",100,2)
    if 'Turn Right' in sign:
      meca.runtimedrobot("lateral right",100,2)
    if 'U Turn' in sign:
      meca.runtimedrobot("backward",100,2)

Logic

  1. Firstly, the code sets up the stage camera to look for signs, and it detects and recognizes the signs shown to the camera.
  2. Next, the code starts a loop where the stage camera continuously checks for the signs.
  3. Finally, if the robot sees certain signs (like ‘Go’, ‘Turn Left’, ‘Turn Right’, or ‘U Turn’), it moves in the corresponding direction (forward, lateral left, lateral right, or backward).
  4. This can help the Mecanum manoeuvre through the terrain easily, just by showing signs to the camera.

Final Output

Forward Motion:

Right-Left Motions:

Read More
Learn how to code logic for video input detection with this example block code. You will be able to direct your own Mecanum easily by just showing signs through the camera input.

Introduction

A sign detector Mecanum robot is a robot that can recognize and interpret certain signs or signals, such as hand gestures or verbal commands, given by a human. The robot uses sensors, cameras, and machine learning algorithms to detect and understand the sign, and then performs a corresponding action based on the signal detected.

These robots are often used in manufacturing, healthcare, and customer service industries to assist with tasks that require human-like interaction and decision making.

Code

Initialization:

Main Code

Logic

  1. Firstly, the code sets up the stage camera to look for signs, and it detects and recognizes the signs shown to the camera.
  2. Next, the code starts a loop where the stage camera continuously checks for the signs.
  3. Finally, if the robot sees certain signs (like ‘Go’, ‘Turn Left’, ‘Turn Right’, or ‘U Turn’), it moves in the corresponding direction (forward, lateral left, lateral right, or backward).
  4. This can help the Mecanum manoeuvre through the terrain easily, just by showing signs to the camera.

Output

Forward Motion:

Right-Left Motions:

Read More
Learn how to program a quadruped robot to perform predefined actions using PictoBlox.

Introduction

In this project, we will explain how to run predefined actions for Quadruped. By the end of the tutorial, learners will have gained knowledge and practical experience in programming a quadruped robot and controlling its movements using PictoBlox.

Code

sprite = Sprite('Tobi')
quarky = Quarky()
import time

quad = Quadruped(4,1,8,5,3,2,7,6)

while True:
  quad.home()
  time.sleep(0.2)
  quad.action("dance1",1000,1)
  time.sleep(0.5/2)
  quad.action("updown1",1000,2)
  time.sleep(0.2)
  quad.action("front back",1000,1)
  time.sleep(0.2)
  quad.action("march in place",1000,1)
  time.sleep(0.2)
  quad.action("left hand wave",1000,1)
  time.sleep(0.2)
  quad.action("right hand wave",1000,1)
  time.sleep(0.2)
  quad.action("bodyshake4",1000,2)
  time.sleep(0.2)
  quad.action("dance5",1000,2)
  time.sleep(0.2)
  quad.action("creepy",1000,2)
  quad.home()

Logic

  1. This is a program that controls a quadruped robot. The robot has been programmed to perform various actions such as dance, wave, and march in place using the action () code.
  2. The program uses a while loop to continuously repeat the actions in a sequence with small pauses in between each action.
  3. The time.sleep function is used to control the duration of each action and the pauses in between them.
  4. The quad.home() function is used to reset the robot’s position to its initial position after each sequence of actions.

Output

Read More
Learn how to program a quadruped robot to perform predefined actions using PictoBlox.

Introduction

In this project, we will explain how to run predefined actions for Quadruped. By the end of the tutorial, learners will have gained knowledge and practical experience in programming a quadruped robot and controlling its movements using PictoBlox.

Code

sprite = Sprite('Tobi')
quarky = Quarky()
import time

quad = Quadruped(4,1,8,5,3,2,7,6)

while True:
  quad.home()
  time.sleep(0.2)
  quad.action("dance1",1000,1)
  time.sleep(0.5/2)
  quad.action("updown1",1000,2)
  time.sleep(0.2)
  quad.action("front back",1000,1)
  time.sleep(0.2)
  quad.action("march in place",1000,1)
  time.sleep(0.2)
  quad.action("left hand wave",1000,1)
  time.sleep(0.2)
  quad.action("right hand wave",1000,1)
  time.sleep(0.2)
  quad.action("bodyshake4",1000,2)
  time.sleep(0.2)
  quad.action("dance5",1000,2)
  time.sleep(0.2)
  quad.action("creepy",1000,2)
  quad.home()

Logic

  1. The robot has been programmed to perform various actions such as dance, wave, and march in place using the action () code.
  2. The program uses a while loop to continuously repeat the actions in a sequence with small pauses in between each action.
  3. The time.sleep() function is used to control the duration of each action and the pauses in between them.
  4. The quad.home() function is used to reset the robot’s position to its initial position after each sequence of actions.

Output

Read More
Learn how to use PictoBlox to control a Quarky robotic arm wirelessly using keyboard keys.

Learn How to Control a Quarky Robotic Arm Wirelessly

Are you looking for a way to make a robotic arm that can be controlled wirelessly? If so, you’ve come to the right place! This tutorial will teach you how to control a Quarky robotic arm using the Bluetooth communication extensions of Quarky and PictoBlox. With this technique, you’ll be able to provide precise control over its movement and actions from a remote location.

The robotic arm can be used for various tasks, such as picking up objects or manipulating tools. It can also be used for applications like manufacturing, medical, research, and exploration. This type of robotic arm will increase the productivity and safety of operations that would otherwise be too hazardous or inaccessible for humans.

Let’s get started learning how to create a wirelessly controlled robotic arm.


Project

In this project, we are making Quarky Robotic Arm to be controlled wirelessly using keyboard inputs. Following are the controls we will program:

  1. Right Arrow – Move 10mm in X axis
  2. Left Arrow – Move -10mm in X axis
  3. Up Arrow – Move 10mm in Z axis
  4. Down Arrow – Move -10mm in Z axis
  5. w Key – Move 10mm in Y axis
  6. s Key – Move -10mm in Y axis
  7. o Key – Open gripper
  8. c Key – Close gripper

Code

  1. Open the Pictoblox application.
  2. Select the Block Coding Environment.
  3. Click on the Robotic Arm extension available in the left corner.

Following is the code to implement the project:

  1. When the green flag is clicked the Robotic Arm initializes and goes to the home position. Then we set the gripper open and close angle for the project. A variable named time is created to control the speed of the gripper. Finally, a loop is implemented calling custom blocks for controlling the X axis, Y axis, Z axis, and gripper controls.
  2. X Axis Control block moves the gripper in the X-axis.
  3. Y Axis Control block moves the gripper in the Y-axis.
  4. Z Axis Control block moves the gripper in the Z-axis.
  5. Gripper Control block opens or closes the gripper on command.

Run the program to test the code.
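If you prefer a script version of the same controls, here is a rough Python sketch. It tracks the arm position in x, y, and z variables and reuses the RoboticArm helpers that appear in the Python examples elsewhere on this page; the sprite.iskeypressed() helper, the pin numbers, and the starting coordinates are assumptions you may need to adapt.

sprite = Sprite('Tobi')
import time

# Pins and gripper angles are assumptions; match them to your arm
roboticArm = RoboticArm(1, 2, 3, 4)
roboticArm.setgripperangle(90, 145)
roboticArm.sethome()

# Track the current gripper position in mm (starting values are assumptions)
x, y, z = 0, 200, 15

while True:
  # sprite.iskeypressed() is assumed here as the keyboard helper
  if sprite.iskeypressed("right arrow"):
    x = x + 10
  if sprite.iskeypressed("left arrow"):
    x = x - 10
  if sprite.iskeypressed("up arrow"):
    z = z + 10
  if sprite.iskeypressed("down arrow"):
    z = z - 10
  if sprite.iskeypressed("w"):
    y = y + 10
  if sprite.iskeypressed("s"):
    y = y - 10
  if sprite.iskeypressed("o"):
    roboticArm.controlgripper("open")
  if sprite.iskeypressed("c"):
    roboticArm.controlgripper("close")
  # Move to the updated position, then pause briefly before re-reading the keys
  roboticArm.movexyzonebyone(x, y, z, 1000)
  time.sleep(0.1)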

Output


You can explore doing pick and place using the robotic arm.

Read More
Learn how to build a Machine Learning model which can identify the type of flower from the camera feed or images using PictoBlox.

Introduction

In this example project we are going to create a Machine Learning Model which can identify the type of flower from the camera feed or images.


Images Classifier in Machine Learning Environment

Image Classifier is an extension of the ML environment that allows users to classify images into different classes. This feature is available only in the desktop version of PictoBlox for Windows, macOS, or Linux. As part of the Image Classifier workflow, users can add classes, upload data, train the model, test the model, and export the model to the Block Coding Environment.

Let’s create the ML model.

Opening Image Classifier Workflow

Alert: The Machine Learning Environment for model creation is available only in the desktop version of PictoBlox for Windows, macOS, or Linux. It is not available in the Web, Android, and iOS versions.

Follow the steps below:

  1. Open PictoBlox and create a new file.
  2. Select the coding environment as Block Coding Environment.
  3. Select the “Open ML Environment” option under the “Files” tab to access the ML Environment.
  4. You’ll be greeted with the following screen.
    Click on “Create New Project“.
  5. A window will open. Type in a project name of your choice and select the “Image Classifier” extension. Click the “Create Project” button to open the Image Classifier window.
  6. You shall see the Image Classifier workflow with two classes already made for you. Your environment is all set. Now it’s time to upload the data.

Class in Image Classifier

Class is the category in which the Machine Learning model classifies the images. Similar images are put in one class.

There are 2 things that you have to provide in a class:

  1. Class Name: It’s the name by which the class will be referred to.
  2. Image Data: This data can either be taken from the webcam or by uploading from local storage.

Note: You can add more classes to the projects using the Add Class button.

Adding Data to Class

You can perform the following operations to manipulate the data into a class.

  1. Naming the Class: You can rename the class by clicking on the edit button.
  2. Adding Data to the Class: You can add the data either by uploading files from the local folder or by capturing them from the Webcam.

Training the Model

After data is added, it’s fit to be used in model training. In order to do this, we have to train the model. By training the model, we extract meaningful information from the images, and that in turn updates the weights. Once these weights are saved, we can use our model to make predictions on data previously unseen.

However, before training the model, there are a few hyperparameters that you should be aware of. Click on the “Advanced” tab to view them.


It’s a good idea to train an image classification model for a high number of epochs. The model can be trained in both JavaScript and Python. In order to choose between the two, click on the switch on top of the Training panel.

Note: These hyperparameters can affect the accuracy of your model to a great extent. Experiment with them to find what works best for your data.

Alert: Dependencies must be downloaded to train the model in Python; otherwise, JavaScript will be chosen by default.

The accuracy of the model should increase over time. The x-axis of the graph shows the epochs, and the y-axis represents the accuracy at the corresponding epoch. Remember, the higher the reading in the accuracy graph, the better the model. The range of the accuracy is 0 to 1.


Other evaluation parameters can be seen by clicking on Train Report.

Here we can see the confusion matrix and the training accuracy of individual classes after training.

Testing the Model

To test the model, simply enter the input values in the “Testing” panel and click on the “Predict” button.

The model will return the probability of the input belonging to the classes.

Export in Block Coding

Click on the “Export Model” button on the top right of the Testing box, and PictoBlox will load your model into the Block Coding Environment if you have opened the ML Environment in the Block Coding.

Code

The idea is simple: we’ll add image samples in the “Backdrops” column, keep cycling through the backdrops, and keep classifying the image on the stage.

  1. Add testing images in the backdrop and delete the default backdrop.
  2. Now, come back to the coding tab and select the Tobi sprite.
  3. We’ll start by adding a when flag clicked block from the Events palette.
  4. Add switch backdrop to () block from the Looks palette. Select any image.
  5. Add a forever block from the  Control palette.
  6. Inside the forever block add an analyze image from () block from the Machine Learning palette.
  7. Add two blocks of say () for () seconds from the Looks palette.
  8. Inside the say block add join () () block from operator palette.
  9. Inside the join block, write a statement in the first empty place, and in the second empty place add the identified class block from the Machine Learning palette.
  10. Finally, add the next backdrop block from the Looks palette below the () bounding box block.

Final Result


Read More
Explore the capabilities of controlling robotic arms for efficient material handling.


Introduction

AMR stands for Autonomous Mobile Robot, here used for autonomous material handling. It refers to a class of robots that are designed to autonomously transport and handle materials in various environments, such as warehouses, factories, hospitals, and distribution centers.

Code

sprite = Sprite('Tobi')

import time

roboticArm = RoboticArm(1, 2, 3, 4)

roboticArm.setgripperangle(90,145)
roboticArm.sethome()
while True:
	roboticArm.controlgripper("open")
	roboticArm.movexyzonebyone(0,200,15,1000)
	roboticArm.gotoinoneaxis(180,"Y",1000)
	roboticArm.controlgripper("close")
	roboticArm.gotoinoneaxis(150,"X",1000)
	roboticArm.controlgripper("open")
	roboticArm.sethome()
	roboticArm.controlgripper("close")
	time.sleep(1)
	roboticArm.controlgripper("open")
	roboticArm.movexyzonebyone(-60,160,10,1000)
	roboticArm.gotoinoneaxis(200,"Y",1000)
	roboticArm.gotoinoneaxis(15,"Z",1000)
	roboticArm.gripperaction("close")
	roboticArm.gotoinoneaxis(150,"X",1000)
	roboticArm.gripperaction("open")
	roboticArm.sethome()
	roboticArm.gripperaction("close")
	time.sleep(0.2)

Logic

  1. Open the Pictoblox application.
  2. Select the Python-based environment.
  3. The code starts by importing the required modules, including the “quarky” module and the “RoboticArm” class from the “expansion_addon” module. These modules provide the necessary functionalities for controlling the robotic arm.
  4. An instance of the RoboticArm class is created with four parameters (1, 2, 3, 4). These parameters likely represent some configuration values or settings for the robotic arm.
  5. The gripper angle of the robotic arm is set to 90 and 145 degrees using the setgripperangle() method.
  6. The sethome() method is used to set the home position of the robotic arm.
  7. The code enters a while loop that runs indefinitely (until the program is manually terminated). Inside the loop, various actions are performed using the robotic arm.
  8. The gripper of the robotic arm is opened and closed using the controlgripper() and gripperaction() methods with the parameters “open” and “close” respectively.
  9. The robotic arm is instructed to move using the movexyzonebyone() method. The parameters provided are the X, Y, and Z coordinates (0, 200, and 15) and the time (1000 ms).
  10. The robotic arm is commanded to move to a specific position in a single axis (X, Y, or Z) using the gotoinoneaxis() method. The parameters include the target position, the axis, and the speed.
  11. The program includes a few additional time.sleep() statements, which introduce short delays before the program repeats the loop.
  12. Press Run to run the code.

Output

Read More
Learn to Interface DHT11 (Temperature and Humidity) sensor and measure Temperature and Humidity with Arduino

Understanding DHT 11 Sensor


DHT11 is a low-cost digital sensor for sensing temperature and humidity. This sensor can be easily interfaced with any microcontroller, such as Arduino or Raspberry Pi, to measure humidity and temperature instantaneously. It comes with a dedicated NTC to measure temperature and an 8-bit microcontroller to output the values of temperature and humidity as serial data. The DHT11 sensor has three pins: VCC, GND, and Data. A pull-up resistor of 5k to 10k ohms is provided for communication between the sensor and the microcontroller.

Circuit Diagram

Code

  1. From the Events palette, add when green flag clicked.
  2. Create two variables, temp and humidity, for storing the temperature and humidity respectively.
  3. From the Control palette, add forever.
  4. From the sensor palette of Arduino, select get humidity and set the humidity variable to it; do the same for temperature.
  5. From the Looks palette, add the say () for () block.
  6. From Operators, add three join () () blocks to display the temperature and humidity with some message.

Script

Output


Read More
Discover Quarky, a versatile microcontroller for programming various projects, from basic to advanced levels in robotics and AI.
Introduction

Quarky is a powerful microcontroller that allows for custom programming of projects, ranging from basic to advanced levels. With various built-in features such as sensors, actuators, and a speaker, Quarky becomes the perfect companion for those venturing into the world of robotics and AI. Its compact size and plug-and-play functionality make it an ideal choice for students eager to learn and experiment with robotics.


Circuit Diagram:


Code:

Follow these steps to implement the code using Pictoblox for Quarky to control the LED based on the IR sensor’s readings:

  1. Open Pictoblox and create a new file in block coding.
  2. Go to boards and select Quarky.
  3. Add an if-then-else block from the Control palette.
  4. From the operators, add the “less than” operator in the conditional space.
  5. Go to Quarky and add the “read analog pin ()” block into the first space of the “less than” operator. Change the value to 500.
  6. Use the “set digital pin () as ()” block from Quarky to turn ON the LED connected at D1 if the value is less than 500.
  7. If the value is above the set value (500), then the LED must turn OFF.
  8. Place the above set of code in the “forever” loop.
  9. Now add “when flag clicked” at the start in the script.

Script:

Output:

In this comprehensive introduction, you have learned about Quarky, the versatile microcontroller, and its potential in robotics and AI projects. Explore its various features, sensors, and plug-and-play functionality. Follow our step-by-step guide to set up the circuit with the IR sensor and LED, and program Quarky using Pictoblox’s block coding. Witness the successful implementation through the final script and output, experiencing the magic of Quarky in action!


Read More
Line-following using Quarky's default IR sensors without PID control, relying on basic input-response for simpler path tracking. Great for understanding fundamental robotics concepts!

Steps:

1. Set the IR threshold for stopping the robot at the crossing lines
2. Use the get IR value block for reading the IR value
3. The below example uses the without-PID line following blocks
4. When you click do line following, the robot starts line following and stops at the check-point (when both IRs are on the black line)

Script

Output

Read More
In this example, we will retrieve the color information from the cloud and make the Quarky lights ON and OFF with the selected color on the cloud.


Adafruit IO Settings

  1. Create two new Feeds named Light and Color.
  2. Create a new Dashboard named RGB Light.
  3. Edit the Dashboard and add a Toggle Block.
  4. Connect the Light feed to the block and click on Next Step.
  5. Edit the Block Setting and click on Create Block.
  6. Block is added. You can try to toggle the switch.
  7. Next, create another block. This time select Color Picker.
  8. Connect to Color Feed.
  9. You will find the blocks on the Dashboard. Edit the layout as per your liking.

Script

The following script reads the Light and Color feed values and makes the Quarky light up accordingly.
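For reference, here is a rough Python version of the same logic, written with the AdaIO helpers used in the other Adafruit IO examples on this page. It assumes the Color feed stores a hex string such as "#FF0000", that the Quarky LED grid is addressed with 1-based coordinates, and that the username and key are placeholders you replace with your own.

quarky = Quarky()
adaio = AdaIO()

quarky.setbrightness(10)
# Replace with your own Adafruit IO username and key
adaio.connecttoadafruitio("YOUR_USERNAME", "YOUR_AIO_KEY")

while True:
  if adaio.getdata("Light") == "ON":
    # The Color feed is assumed to hold a hex string like "#FF0000"
    color = adaio.getdata("Color")
    r = int(color[1:3], 16)
    g = int(color[3:5], 16)
    b = int(color[5:7], 16)
    # Light up the 7x5 Quarky LED grid with the selected color
    for px in range(1, 8):
      for py in range(1, 6):
        quarky.setled(px, py, [r, g, b], 100)
  else:
    quarky.cleardisplay()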


Output

Read More
Learn how to create a dance sequence with Quadruped and Music using PictoBlox in this tutorial.

Introduction

The example demonstrates how to create a dance sequence with Quadruped with Music.

Code for Stage Mode

When Quarky‘s left pushbutton is pressed, it will wait for 1 second and play a sound. After that, the Quadruped will wave its right hand and move back to the home position. Then Quarky will play different tones and the Quadruped will do different actions. Finally, the Quadruped will move back to the home position.

Code for Upload Mode

You can also make the Quadruped dance work independent of PictoBlox using the Upload Mode. For that, switch to upload mode and replace the when green flag clicked block with the when Quarky starts up block.

We will make the Quadruped start the dance sequence when the Left key is pressed.

Output

Explore: Try to create your dance sequence.
Read More
Learn how to create a light switch in Adafruit IO with Python code. Step-by-step guide on how to connect Quarky to Adafruit IO and create a dashboard with a toggle block.

In this example, we are going to understand how to make a feed on Adafruit IO. Later we will write the Python code that will retrieve the information from the cloud and make the Quarky lights ON and OFF.

Adafruit IO Key

You can find the information about your account once you log in from here:

Note: Make sure you are logged in on Adafruit IO: https://io.adafruit.com/

Creating a Light Switch in Adafruit IO

  1. Create a new Feed named Light.
  2. Create a new Dashboard named Light Control.
  3. Edit the Dashboard and add a Toggle Block.
  4. Connect the Light feed to the block and click on Next Step.
  5. Edit the Block Setting and click on Create Block.
  6. Block is added. You can try to toggle the switch.
  7. Go to the Light feed. You will observe the value of the feed changing as we click on the switch on the Dashboard.

Python Code for Stage Mode

This Python code creates two instances, one of the Quarky class and one of the AdaIO class. It sets the brightness of the Quarky instance to 10 and connects to Adafruit IO using the specified username and key. It then creates a loop that checks whether the value of the Light feed from Adafruit IO is “ON”. If the data is “ON”, the white light is turned on on the display; otherwise, the display is cleared.

# Create a new instance of the Quarky class and call it quarky
quarky = Quarky()

# Create a new instance of the AdaIO class and call it adaio
adaio = AdaIO()

# Set the brightness of Quarky to 10
quarky.setbrightness(10)

# Connect to Adafruit IO using the specified username and key
adaio.connecttoadafruitio("STEMNerd", "aio_UZBB56f7VTIDWyIyHX1BCEO1kWEd")

# Create an infinite loop
while True:

  # Check to see if the value of the "Light" feed from Adafruit IO is "ON"
  if (adaio.getdata("Light") == "ON"):
    # If the data is "ON" draw the specified pattern
    quarky.drawpattern("aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa")

  # Otherwise, clear the display
  else:
    quarky.cleardisplay()

Output

Python Code for Upload Mode

You can also make the script work independently of PictoBlox using the Upload Mode. For that switch to upload mode.

Note:  In the upload mode your Quarky needs to connect to the WiFi network. You can learn how to do it here: https://ai.thestempedia.com/example/wi-fi-connect-for-quarky-in-upload-mode-python/
# This code connects to a wifi network and an adafruit IO account.
# It also uses the module "quarky" to set the brightness and draw a pattern.
# The code will check if the wifi is connected and if the value of "Light" in the adafruit IO account is "ON",
# it will draw a pattern on the display. If not, it will clear the display.
# If the wifi is not connected, it will set LED 1 to red.

from quarky import *

# imported module
import iot

# Connect to a wifi network
wifi = iot.wifi()
wifi.connecttowifi("IoT", "12345678")

# Connect to an adafruit IO account
adaio = iot.AdaIO()
adaio.connecttoadafruitio("STEMNerd", "aio_UZBB56f7VTIDWyIyHX1BCEO1kWEd")

# Set the brightness
quarky.setbrightness(10)

while True:
  # Check if the wifi is connected
  if wifi.iswificonnected():
    # Check if the value of "Light" in the adafruit IO account is "ON"
    if (adaio.getdata("Light") == "ON"):
      # Draw a white LED pattern on the display
      quarky.drawpattern("aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa")
    else:
      # Clear the display
      quarky.cleardisplay()
  else:
    # Set LED 1 to red
    quarky.setled(1, 1, [255, 0, 0], 100)

Troubleshooting:

  1. If the Green Light comes on, your Wi-Fi is connected.
  2. If the Red Light comes on, your Wi-Fi is not connected. Change the Wi-Fi name and password and try again.
  3. If the Red Cross sign comes on, a Python error has occurred on the Quarky. Check the serial monitor and try to reset the Quarky.
Read More
Learn how to code the Mars Rover to turn left and right on a circle with the set () to () block. Try different left and right orientations and move the Mars Rover with the up and down keys.

Introduction

In the last project, we looked at the Mars Rover control for turning left and right.

Instead of rotating the Mars Rover in place to turn left or right, you can alternatively make the Mars Rover move in a circle.

  1. Turning left on a circle:
  2. Turning right on a circle:

This can be executed with the set () to () block. You have 2 other options: Left and Right.

Left Orientation

Right Orientation

Coding Steps

The following code sets the servo motor position to the left, straight, and right when the a, s, and d keys are pressed. Then, to move the Mars Rover, the code checks whether the up or down key is pressed.

Make the code and play with the Mars Rover.

Output

Circular Right-Left Motion

Read More
Learn about AI-based face expression detection, computer vision techniques to analyze images or videos of human faces and recognize emotions or expressions.

Introduction

AI-based face expression detection refers to the use of artificial intelligence algorithms and computer vision techniques to analyze images or videos of human faces and recognize the emotions or expressions being displayed. The technology can detect and analyze subtle changes in facial features, such as eye movement, mouth shape, and eyebrow position, to determine whether a person is happy, sad, angry, surprised, or expressing other emotions.

Discover the various fields that utilize this technology, including psychology, marketing, and human-computer interaction. Additionally, read about the logic and code behind face detection with a camera feed, including the initialization of parameters, face detection library, loop execution, and if-else conditions. Explore how the technology continuously analyzes emotions, and how the Humanoid responds with different facial expressions and movements.

Code

sprite = Sprite('Tobi')
fd = FaceDetection()
quarky = Quarky()
import time

humanoid = Humanoid(7, 2, 6, 3, 8, 1)

# Turn the video ON with 0% transparency
fd.video("ON", 0)
fd.enablebox()

# Run this script forever
while 1:
  fd.analysecamera()          # Analyse image from camera 
  sprite.say(fd.expression()) # Say the face expressions
  
  if fd.isexpression(1, "happy"): # if face expression is happy
    quarky.showemotion("happy")   # show happy emotion on Quarky
    humanoid.action("dance2", 1000, 1)
    
  elif fd.isexpression(1, 'sad'):
    quarky.showemotion("crying")
    humanoid.action("updown", 1000, 1)

  elif fd.isexpression(1, 'surprise'):
    quarky.showemotion('surprise')
    humanoid.action("moonwalker", 1000, 1)

  elif fd.isexpression(1, 'angry'):
    quarky.showemotion('angry')
    humanoid.action("flapping2", 1000, 1)
  else:
    humanoid.home()
    
# Comment the above script, uncomment the below script and 
# run this script to clear the stage and quarky display

# fd.disablebox()
# fd.video("off")
# quarky.cleardisplay()

Logic

The example demonstrates how to use face detection with a camera feed. Following are the key steps happening:

  1. Creates a sprite object named ‘Tobi’. A sprite is typically a graphical element that can be animated or displayed on a screen. It also creates a Quarky object.
  2. Creates a face detection object named ‘fd’ using fd = FaceDetection(). This object is responsible for detecting faces in images or video.
  3. Imports the ‘time’ module, which provides functions to work with time-related operations, using import time.
  4. Creates a humanoid object with specific pins assigned to control various actions of the humanoid robot.
  5. Turns on the video display with 0% transparency for the face detection module using fd.video(“ON”, 0).
  6. Enables the face detection module to draw boxes around detected faces using fd.enablebox().
  7. The code enters an infinite loop using while 1, which means it will keep running indefinitely until interrupted.
  8. Analyzes the image from the camera for face detection using fd.analysecamera().
  9. The sprite says the detected face expression obtained from the face detection module using sprite.say(fd.expression()).
  10. The code checks for different face expressions using if statements and performs corresponding actions.
  11. For example, if the face expression is determined to be “happy”, the Quarky device shows a “happy” emotion, and the humanoid performs a dance action.
  12. Similarly, other face expressions like “sad”, “surprise”, and “angry” trigger specific emotional displays on Quarky and corresponding actions on the humanoid.
  13. If none of the predefined face expressions match, the humanoid goes back to its default or “home” position.

Output

Read More
Incorporate a fun activity into your artificial intelligence learning journey by using Humanoid robots to learn about face detection.

Introduction

As we start learning artificial intelligence, let’s make it more engaging by incorporating a fun activity. One of the most popular topics in AI is face detection, and we can make it even more exciting by learning it with the help of Humanoid robots. Are you interested in learning it together?

Code

Logic

  1. Simply drag and drop the RHip(), LHip(), RFoot(), LFoot(), RHand(), LHand() block from the Humanoid extension.
  2. Start the program by initializing the sprite and face detection library parameters.
  3. Use the forever loop block to create a continuous loop.
  4. If the camera detects more than one face, the Humanoid will move forward with a specific time, speed, and dance move with do() motion() times at () speed() block.
  5. If no face is detected, the Humanoid will move backward at a specific time and speed using do() motion() times at () speed() block.
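A rough Python equivalent of this logic is sketched below, reusing the FaceDetection and Humanoid helpers from the previous example. The fd.count() call for the number of detected faces, the "backward" direction name, and the pin numbers are assumptions.

sprite = Sprite('Tobi')
fd = FaceDetection()

humanoid = Humanoid(7, 2, 6, 3, 8, 1)

fd.video("ON", 0)
fd.enablebox()
humanoid.home()

while True:
  fd.analysecamera()
  # fd.count() is assumed to return the number of faces detected
  if fd.count() > 1:
    # Faces detected: move forward and do a dance move
    humanoid.move("forward", 1000, 1)
    humanoid.action("dance1", 1000, 1)
  elif fd.count() == 0:
    # No face detected: move backward
    humanoid.move("backward", 1000, 1)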

Output

Read More
Learn how to code logic for speech recognized control of Mecanum with this example block code. You will be able to direct your own Mecanum easily by just speaking commands.


Introduction

A speech-recognition-controlled Mecanum robot is a robot that can recognize and interpret our speech: verbal commands given by a human. The code uses a speech recognition model that records and analyzes the speech you give and reacts accordingly on the Mecanum.

Speech recognition robots can be used in manufacturing and other industrial settings to control machinery, perform quality control checks, and monitor equipment.

They are also used to help patients with disabilities to communicate with their caregivers, or to provide medication reminders and other health-related information.

Code

sprite=Sprite('Tobi')
import time
meca=Mecanum(1,2,7,8)
quarky = Quarky()
sr = SpeechRecognition()
ts = TexttoSpeech()
sr.analysespeech(4, "en-US")
command = sr.speechresult()
command = command.lower()
if 'forward' in command:
  meca.runtimedrobot("forward",100,2)
elif 'back' in command:
  meca.runtimedrobot("backward",100,2)
elif 'right' in command:
  meca.runtimedrobot("lateral right",100,2)
elif 'left' in command:
  meca.runtimedrobot("lateral left",100,2)

time.sleep(10)
sprite.stopallsounds()

Logic

  1. Firstly, the code initializes the Mecanum pins and starts recording the microphone of the device to store the audio command of the user.
  2. The code then checks whether the command includes the word “forward” or not. You can use customized commands and test for different conditions on your own.
  3. If the first condition stands false, the code again checks for different keywords that are included in the command.
  4. When any condition stands true, the robot will align itself accordingly and move in that direction of the respective command.

Output

Read More
Learn how to code logic for speech recognized control of Mecanum with this example block code. You will be able to direct your own Mecanum easily by just speaking commands.

Introduction

A speech-recognition-controlled Mecanum robot is a robot that can recognize and interpret our speech: verbal commands given by a human. The code uses a speech recognition model that records and analyzes the speech you give and reacts accordingly on the Mecanum.

Speech recognition robots can be used in manufacturing and other industrial settings to control machinery, perform quality control checks, and monitor equipment.

They are also used to help patients with disabilities to communicate with their caregivers, or to provide medication reminders and other health-related information.

Main Code:

Logic

  1. Firstly, the code initializes the Mecanum pins and starts recording the microphone of the device to store the audio command of the user.
  2. The code then checks whether the command includes the word “Forward” or not. You can use customized commands and test for different conditions on your own.
  3. If the first condition stands false, the code again checks for different keywords that are included in the command.
  4. When any condition stands true, the robot will align itself accordingly and move in that direction of the respective command.

Output

Read More
Learn how to create a dance sequence with Quadruped and Music using PictoBlox in this tutorial.

Introduction

The example demonstrates how to create a dance sequence with Quadruped with Music.

Code for Stage Mode

  1. When Quarky‘s left pushbutton is pressed, it will wait for 1 second and play a sound.
  2. After that, the Quadruped will wave its right hand and move back to the home position.
  3. Then Quarky will play different tones and the Quadruped will do different actions.
  4. Finally, the Quadruped will move back to the home position.
sprite = Sprite('Tobi')
quarky = Quarky()
import time

quad=Quadruped(4,1,8,5,3,2,7,6)


quad.home()
while True:
	quarky.playsound("QuarkyIntro")
	quad.action("right hand wave",700,2)
	quad.home()
	quarky.playtone("E5",8)
	quad.action("front back",700,3)
	quarky.playtone("C4",8)
	quad.action("bodyshake2",700,3)
	quarky.playtone("D4",8)
	quad.action("bodyshake3",700,3)
	quarky.playtone("E4",8)
	quad.action("bodyshake4",700,3)
	quarky.playtone("C5",8)
	quad.action("updown1",700,3)
	quad.home()

Code for Upload Mode

You can also make the Quadruped dance work independent of PictoBlox using the Upload Mode. For that switch to upload mode. 

Output

Explore: Try to create your dance sequence.
Read More
Convert any word or phrase into a delightful sequence of emojis with our Emoji Converter.

Introduction

Are you looking to add some fun and expressiveness to your conversations? Look no further! I’m here to help you convert any word or phrase into a colorful array of emojis. Whether you want to spice up your messages, or social media posts, or simply bring a smile to someone’s face, I’ve got you covered.

Just type in the word or phrase you want to transform, and I’ll generate a delightful sequence of emojis that captures the essence of your text. Emojis are a universal language that transcends words, from happy faces to animals, objects, and everything in between.

So, let’s get started and infuse your text with a touch of emoji magic! 🎉🔥

Logic

This code allows the user to interact with the sprite and provide a word or phrase, which is then transformed into emojis using the ChatGPT model. The sprite then speaks the generated emoji response.

  1. Open PictoBlox and create a new file.
  2. Choose a suitable coding environment for python-based coding.
  3. Define a sprite, Tobi.
  4. Then, we create an instance of the ChatGPT model using the ChatGPT class.
  5. The sprite asks the user to input the emojis they want to use by calling the input method.
  6. The sprite uses its answer method to get the user’s response, which is then converted to a string using str().
  7. The movieToemoji method of the ChatGPT model converts the user’s response into emojis.
  8. The chatGPTresult method retrieves the result of the ChatGPT conversation.
  9. Finally, the sprite says the result for 5 seconds using the say method.

Code

sprite = Sprite('Tobi')
gpt = ChatGPT()

sprite.input("Please let me know which emojis you'd like me to use by typing them here.")
answer = str(sprite.answer())

gpt.movieToemoji(answer)
result = gpt.chatGPTresult()

sprite.say(result,5)

Output

Read More
Convert any word or phrase into a delightful sequence of emojis with our Emoji Converter.

Introduction

Are you looking to add some fun and expressiveness to your conversations? Look no further! I’m here to help you convert any word or phrase into a colorful array of emojis. Whether you want to spice up your messages, or social media posts, or simply bring a smile to someone’s face, I’ve got you covered.

Just type in the word or phrase you want to transform, and I’ll generate a delightful sequence of emojis that captures the essence of your text. Emojis are a universal language that transcends words, from happy faces to animals, objects, and everything in between.

So, let’s get started and infuse your text with a touch of emoji magic! 🎉🔥

Logic

This code allows the user to interact with the sprite and provide a word or phrase, which is then transformed into emojis using the ChatGPT model. The sprite then speaks the generated emoji response.

  1. Open PictoBlox and create a new file.
  2. Choose a suitable coding environment for Block-based coding.
  3. Define a sprite, Tobi.
  4. Then, we create an instance of the ChatGPT model using the ChatGPT class.
  5. The sprite will ask you which word you want to convert into emojis.
  6. ChatGPT will respond using the getAIresponce block.
  7. The sprite will display the chosen word in emojis.
  8. Press Green Flag to Run the code.

Code

Output

Read More
Discover how gesture-controlled robotic arms revolutionize robotics with intuitive control. Learn about their applications in manufacturing, healthcare, and virtual reality.

Introduction

A gesture-controlled robotic arm is a robotic arm that can be controlled using hand or body movements instead of traditional buttons or joysticks. It uses sensors and algorithms to interpret the gestures made by a user and translates them into commands for the robotic arm.

The user wears or holds a device with sensors, such as a glove or wristband, that captures their hand movements or body gestures. These movements are processed by a computer or microcontroller, which analyzes them and recognizes specific gestures using algorithms and machine learning techniques.

Once the gestures are recognized, the system generates commands for the robotic arm to move accordingly. The arm can have multiple joints and degrees of freedom to perform complex movements. The user’s gestures are mimicked by the robotic arm, allowing them to control its actions.

Gesture-controlled robotic arms are used in various fields, including manufacturing, healthcare, and virtual reality. They provide a more intuitive and natural way of controlling robotic systems, eliminating the need for complex input devices and extensive training.

Hand Gesture Classifier Workflow

Follow the steps below:

  1. Open PictoBlox and create a new file.
  2. Select the appropriate coding environment. You can click on “Machine Learning Environment” to open it.
  3. Select the “Open ML Environment” option under the “Files” tab to access the ML Environment.
  4. Click on “Create New Project“.
  5. A window will open. Type in a project name of your choice and select the “Hand Gesture Classifier” extension. Click the “Create Project” button to open the Hand Pose Classifier window.
  6. You shall see the Classifier workflow with two classes already made for you. Your environment is all set. Now it’s time to upload the data.

Class in Hand Gesture Classifier

There are 2 things that you have to provide in a class:

  1. Class Name: The name to which the class will be referred.
  2. Hand Pose Data: This data can be taken from the webcam or uploaded from local storage.

Note: You can add more classes to the projects using the Add Class button.
Adding Data to Class

You can perform the following operations to manipulate the data into a class.

  1. Naming the Class: You can rename the class by clicking on the edit button.
  2. Adding Data to the Class: You can add the data using the Webcam or by Uploading the files from the local folder.
    1. Webcam:
Note: You must add at least 20 samples to each of your classes for your model to train. More samples will lead to better results.
Training the Model

After data is added, it’s fit to be used in model training. To do this, we have to train the model. By training the model, we extract meaningful information from the hand pose, and that in turn updates the weights. Once these weights are saved, we can use our model to predict previously unseen data.

The accuracy of the model should increase over time. The x-axis of the graph shows the epochs, and the y-axis represents the accuracy at the corresponding epoch. Remember, the higher the reading in the accuracy graph, the better the model. The range of accuracy is 0 to 1.

Testing the Model

To test the model, simply enter the input values in the “Testing” panel and click on the “Predict” button.

The model will return the probability of the input belonging to the classes.

Export in Block Coding

Click on the “Export Model” button on the top right of the Testing box, and PictoBlox will load your model into the Block Coding Environment if you have opened the ML Environment in the Block Coding.

Logic

The robotic arm will move according to the following logic:

  1. When the left gesture is detected – the robotic arm will move in the anti-clockwise direction.
  2. When the right gesture is detected – the robotic arm will move in the clockwise direction.

Code

Logic

  1. First, we initialize the Robotic Arm extension.
  2. Then, we open the recognition window, which will identify different poses, and turn on the camera with a certain level of transparency to identify images from the stage.
  3. If the identified class is “left,” the robotic arm will move in the anti-clockwise direction using the move in () circle of center X() Z(), radius() & along Y() in ()ms block.
  4. If the identified class is “right,” the robotic arm will move in the clockwise direction using the move in () circle of center X() Z(), radius() & along Y() in ()ms block.
  5. Press Run to run the code.

Output

Read More
Learn how to build a Machine Learning model which can identify the type of waste from the camera feed or images using PictoBlox.

Introduction

In this example project we are going to create a Machine Learning Model which can identify the type of waste from the camera feed or images.

Images Classifier in Machine Learning Environment

Image Classifier is an extension of the ML environment that allows users to classify images into different classes. This feature is available only in the desktop version of PictoBlox for Windows, macOS, or Linux. As part of the Image Classifier workflow, users can add classes, upload data, train the model, test the model, and export the model to the Block Coding Environment.

Let’s create the ML model.

Opening Image Classifier Workflow

Alert: The Machine Learning Environment for model creation is available only in the desktop version of PictoBlox for Windows, macOS, or Linux. It is not available in the Web, Android, and iOS versions.

Follow the steps below:

  1. Open PictoBlox and create a new file.
  2. Select the coding environment as Block Coding Environment.
  3. Select the “Open ML Environment” option under the “Files” tab to access the ML Environment.
  4. You’ll be greeted with the following screen.
    Click on “Create New Project“.
  5. A window will open. Type in a project name of your choice and select the “Image Classifier” extension. Click the “Create Project” button to open the Image Classifier window.
  6. You shall see the Image Classifier workflow with two classes already made for you. Your environment is all set. Now it’s time to upload the data.

Class in Image Classifier

Class is the category in which the Machine Learning model classifies the images. Similar images are put in one class.

There are 2 things that you have to provide in a class:

  1. Class Name: It’s the name by which the class will be referred to.
  2. Image Data: This data can either be taken from the webcam or by uploading from local storage.

Note: You can add more classes to the projects using the Add Class button.

Adding Data to Class

You can perform the following operations to manipulate the data into a class.

  1. Naming the Class: You can rename the class by clicking on the edit button.
  2. Adding Data to the Class: You can add the data either by uploading files from the local folder or by capturing them from the Webcam.

Training the Model

After data is added, it’s fit to be used in model training. In order to do this, we have to train the model. By training the model, we extract meaningful information from the images, which in turn updates the weights. Once these weights are saved, we can use our model to make predictions on data previously unseen.

However, before training the model, there are a few hyperparameters that you should be aware of. Click on the “Advanced” tab to view them.


It’s a good idea to train an image classification model for a high number of epochs. The model can be trained in both JavaScript and Python. In order to choose between the two, click on the switch on top of the Training panel.

Note: These hyperparameters can affect the accuracy of your model to a great extent. Experiment with them to find what works best for your data.

Alert: Dependencies must be downloaded to train the model in Python; otherwise, JavaScript will be chosen by default.

The accuracy of the model should increase over time. The x-axis of the graph shows the epochs, and the y-axis represents the accuracy at the corresponding epoch. Remember, the higher the reading in the accuracy graph, the better the model. The range of accuracy is 0 to 1.


Other evaluation parameters can be seen by clicking on Train Report.

Here we can see the confusion matrix and training accuracy of individual classes after training.

Testing the Model

To test the model, simply enter the input values in the “Testing” panel and click on the “Predict” button.

The model will return the probability of the input belonging to the classes.

Export in Block Coding

Click on the “Export Model” button on the top right of the Testing box, and PictoBlox will load your model into the Block Coding Environment if you have opened the ML Environment in the Block Coding.

Code

The idea is simple: we’ll add image samples in the “Backdrops” column, keep cycling through the backdrops, and keep classifying the image on the stage.

  1. Add testing images in the backdrop and delete the default backdrop.
  2. Now, come back to the coding tab and select the Tobi sprite.
  3. We’ll start by adding a when flag clicked block from the Events palette.
  4. Add switch backdrop to () block from the Looks palette. Select any image.
  5. Add a forever block from the  Control palette.
  6. Inside the forever block add an analyze image from () block from the Machine Learning palette.
  7. Add two blocks of say () for () seconds from the Looks palette.
  8. Inside the say block add join () () block from operator palette.
  9. Inside the join block, write a statement in the first empty place, and in the second empty place add the identified class block from the Machine Learning palette.
  10. Finally, add the next backdrop block from the Looks palette below the () bounding box block.

Final Result

You can build more applications on top of this waste classifier.

Read More
Learn about LDR (Light Dependent Resistor) sensor and its analog properties while creating a fun activity using Quarky.
Introduction

The LDR (Light Dependent Resistor) sensor is a renowned analog-type sensor that adapts its resistance according to surrounding light intensity. With its ability to fluctuate resistance based on light changes, the LDR plays a vital role in various light-sensing applications. Though typically designed as a two-pin sensor, it is also available as a three-pin module, offering enhanced features and versatility.

In this example, we embark on a fun activity using the LDR sensor and Quarky. Through this engaging experience, you will grasp essential concepts like analog and PWM signals, creating an exciting learning journey. So, let’s dive in and explore the wonders of LDR and Quarky together!

Circuit Diagram:

Code:

Follow these steps to implement the code using Pictoblox for Quarky and explore the behavior of the LDR sensor:

  1. Open Pictoblox and create a new file.
  2. From the board menu, select Quarky and connect it to Pictoblox.
  3. Add the “when flag clicked” event block into the scripting area.
  4. Create two variables named “brightness” and “LDR value” from the “my variables” category and set them both to 0.
  5. Add a forever loop to ensure continuous execution of the code.
  6. From the sensor palette of Quarky, drag the read analog sensor () at pin () block and set the LDR value variable equal to the value of this block, as shown below.
  7. Map this value to change the range from 0-4095 to 0-255 (0V-5V) and store the result in the “brightness” variable.
  8. Go to the Quarky palette and drag the “set PWM pin() output as()” block. Select D1 from the drop-down menu.
  9. Set the PWM value as the “brightness” variable.


With these steps, your script is complete, and Quarky is ready to interact with the LDR sensor.

Output:

Through this exciting project, you have learned about the LDR sensor, its analog characteristics, and how Quarky can control an LED based on the light intensity sensed by the LDR. Delve deeper into the concepts of analog and PWM signals, making your robotics journey even more captivating with Quarky! Stay curious and keep exploring the endless possibilities!

Read More
Line-following using external IR sensors and Quarky with PID control, featuring adaptive feedback for precise and efficient path tracking. Perfect for advanced robotics learning!

Steps:

  1. Connect the external IR module’s analog pin to a Quarky analog pin and select that pin on the block.
  2. Set the IR threshold for line following and for stopping the robot at the crossing line.
  3. Use the read analog pin block to read the IR value and determine a suitable IR threshold.
  4. The example below uses the line-following blocks without PID (a generic PID sketch follows this list for comparison).
  5. When you click do line following, the robot starts following the line and stops at the checkpoint (when both IR sensors are on the black line).
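
For comparison, a PID line follower computes a steering correction from the difference between the two IR readings. The sketch below is generic and illustrative: read_left_ir(), read_right_ir(), and set_motors() are hypothetical stand-ins for the corresponding Quarky blocks, and the gains need tuning on your robot.

# Generic PID line-following loop (illustrative, not the exact Quarky API).
Kp, Ki, Kd = 0.5, 0.0, 0.1         # PID gains: tune experimentally
BASE_SPEED = 60
integral = 0
last_error = 0

def read_left_ir(): return 0       # hypothetical stand-in for read analog pin
def read_right_ir(): return 0      # hypothetical stand-in for read analog pin
def set_motors(left, right): pass  # hypothetical stand-in for the drive blocks

while True:
    # Error is the difference between the IR readings: zero when centered.
    error = read_left_ir() - read_right_ir()
    integral += error
    derivative = error - last_error
    last_error = error
    correction = Kp * error + Ki * integral + Kd * derivative
    # Steer by speeding one wheel up and slowing the other down.
    set_motors(BASE_SPEED + correction, BASE_SPEED - correction)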

Script

 

Read More
The project demonstrates how to create a smart plug that can be controlled by an IoT device and that can retrieve information from the cloud. The smart plug can be used to turn lights ON and OFF.


Adafruit IO Settings

We will be using Adafruit IO for creating a switch on the cloud. Follow the instructions:

  1. Create a new Feed named Light.
  2. Create a new Dashboard named Light Control.
  3. Edit the Dashboard and add a Toggle Block.
  4. Connect the Light feed to the block and click on Next Step.
  5. Edit the Block Setting and click on Create Block.
  6. The block is added. You can try toggling the switch.
  7. Go to the Light feed. You will observe the value of the feed changing as we click on the switch on the Dashboard.

Circuit

The bulb is connected to the smart plug, which is controlled by a relay.

Note: A relay is an electromechanical switch that uses a small amount of power to energize an electromagnet, which opens or closes a circuit.

If the relay is ON, the smart plug turns ON, switching on the light. The relay has the following connections:

  1. GND Pin connected to GND of the Quarky Expansion Board.
  2. VCC Pin connected to VCC of the Quarky Expansion Board.
  3. Signal Pin connected to Servo 4 of the Quarky Expansion Board.

Code

The logic is the following: we’ll connect to the Adafruit IO account, fetch the switch state from the cloud, and use an if block to turn the relay on or off accordingly (a Python sketch of this logic follows below).
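
For reference, here is a minimal Python sketch of the same polling logic, using the AdaIO helpers that appear in the Python examples later on this page; set_relay() is a hypothetical placeholder for whichever block or call drives the relay’s signal pin (Servo 4 on the expansion board), and the credentials are placeholders.

# Minimal sketch of the block logic above (assumptions noted in comments).
import time
import iot

adaio = iot.AdaIO()
adaio.connecttoadafruitio("your_username", "your_aio_key")  # placeholder credentials

def set_relay(on):
    # Hypothetical placeholder: drive the relay signal pin (Servo 4) here.
    pass

while True:
    # Fetch the switch state from the Light feed on the cloud.
    if adaio.getdata("Light") == "ON":
        set_relay(True)    # relay ON -> smart plug ON -> light ON
    else:
        set_relay(False)   # relay OFF -> light OFF
    time.sleep(1)          # poll the feed once per second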

Output

IoT Enabled Smart Plug Upload Mode

You can also make the IoT Enabled Smart Plug work independently of PictoBlox using the Upload Mode. For that, switch to Upload Mode and replace the when green flag clicked block with the when Quarky starts up block.

You can download the code from here: IoT enabled Smart Plug – Upload Mode

Read More
Learn how to create a Color Switch on Adafruit IO and the Python code needed to retrieve the color information from the cloud and make the Quarky lights ON and OFF with the selected color.

In this example, we are going to make a colored switch in Adafruit IO. Then we will make Python code for Quarky that will retrieve the color information from the cloud and make the Quarky lights ON and OFF with the selected color on the cloud.

Adafruit IO Key

You can find the information about your account once you log in from here:

 

Note: Make sure you are logged in to Adafruit IO: https://io.adafruit.com/

Creating the Dashboard for the Color Switch

  1. Create two new Feeds named Light and Color.
  2. Create a new Dashboard named RGB Light.
  3. Edit the Dashboard and add a Toggle Block.
  4. Connect the Light feed to the block and click on Next Step.
  5. Edit the Block Setting and click on Create Block.
  6. The block is added. You can try toggling the switch.
  7. Next, create another block. This time select Color Picker.
  8. Connect it to the Color feed.
  9. You will find the blocks on the Dashboard. Edit the layout as per your liking.

Python Code for Stage Mode

This code creates an instance of the Quarky and AdaIO classes and connects to Adafruit IO using credentials. It then creates a loop that checks if the light is on and, if so, sets the LED on the Quarky to the color from the Adafruit IO data. If the light is not on, the display on the Quarky is cleared. The loop pauses for 1 second before repeating.

# Create an instance of the quarky class
quarky = Quarky()

# Import the time library
import time

# Create an instance of the AdaIO class
adaio = AdaIO()

# Clear the display on the quarky
quarky.cleardisplay()

# Connect to Adafruit IO using the given credentials
adaio.connecttoadafruitio("STEMNerd", "aio_UZBB56f7VTIDWyIyHX1BCEO1kWEd")

# Loop forever
while True:
  # Get the data from Adafruit IO to see if the light is on
  if (adaio.getdata("Light") == "ON"):
    # Get the color data from Adafruit IO
    adaio.getcolordata("Color")
    # Get the RGB values from the color data
    Color = [adaio.getRGB(1), adaio.getRGB(2), adaio.getRGB(3)]
    
    # Loop through the 5 rows and 7 columns on the quarky
    for i in range(1, 6):
      for j in range(1, 8):
        # Set the LED on the quarky to the Color with a brightness of 20
        quarky.setled(j, i, Color, 20)

  # If the light is not on, clear the display on the quarky
  else:
    quarky.cleardisplay()

  # Pause the program for 1 second
  time.sleep(1)

 

Output

Python Code for Upload Mode

You can also make the script work independently of PictoBlox using the Upload Mode. For that, switch to Upload Mode.

Note: In Upload Mode, your Quarky needs to connect to a WiFi network. You can learn how to do it here: https://ai.thestempedia.com/example/wi-fi-connect-for-quarky-in-upload-mode-python/
# Create an instance of the quarky class
from quarky import *

# Import the library
import time
import iot

# Connect to a wifi network
wifi = iot.wifi()
wifi.connecttowifi("IoT", "12345678")

# Connect to an adafruit IO account
adaio = iot.AdaIO()
adaio.connecttoadafruitio("STEMNerd", "aio_UZBB56f7VTIDWyIyHX1BCEO1kWEd")

# Clear the display on the quarky
quarky.cleardisplay()

# Loop forever
while True:
  # Check if the wifi is connected
  if wifi.iswificonnected():
    
    # Get the data from Adafruit IO to see if the light is on
    if (adaio.getdata("Light") == "ON"):
      # Get the color data from Adafruit IO
      adaio.getcolordata("Color")
      # Get the RGB values from the color data
      Color = [adaio.getRGB(1), adaio.getRGB(2), adaio.getRGB(3)]

      # Loop through the 5 rows and 7 columns on the quarky
      for i in range(1, 6):
        for j in range(1, 8):
          # Set the LED on the quarky to the Color with a brightness of 20
          quarky.setled(j, i, Color, 20)

    # If the light is not on, clear the display on the quarky
    else:
      quarky.cleardisplay()

    # Pause the program for 1 second
    time.sleep(1)
    
  else:
    # Set LED 1 to red
    quarky.setled(1, 1, [255, 0, 0], 100)

Troubleshooting:

  1. If a red cross sign appears on the Quarky, a Python error has occurred. Check the serial monitor and try resetting the Quarky. Also, ensure that the WiFi network you are trying to connect to is available.
Read More
Learn to control Mars Rover using Dabble App on your device with customized functions for specialized circular motions.

Introduction

In this activity, we will control the Mars Rover according to our needs using the Dabble application on our own devices.

We will first understand how to operate Dabble and how to modify our code according to the requirements. The following image is the front page of the Dabble Application.

Select the Gamepad option from the Home Screen and we will then use the same gamepad to control our Mars Rover.

Code

The following blocks represent the different functions created to control the Mars Rover for different types of motions. We will use the arrow buttons to control the basic movements (Forward, Backward, Left, Right).

We will create our own custom functions for the specialized circular motions of the Mars Rover, using the Cross, Square, Circle, and Triangle buttons to control them.

Note: You will have to add the extensions of Mars Rover and also of Dabble to access the blocks.

The main code is quite simple, consisting of nested if-else blocks that determine the action when a specific button is pressed in the Dabble application (a sketch of this dispatch pattern is shown below).
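
For illustration, the dispatch pattern looks like the sketch below; is_pressed() is a hypothetical stand-in for the Dabble gamepad button-check block, and the motion functions stand in for the custom functions described above.

# Illustrative button-dispatch sketch (hypothetical helpers, not the exact API).
def is_pressed(button):
    return False               # hypothetical stand-in for the Dabble button check

def move_forward(): pass       # custom function: basic forward motion
def move_backward(): pass      # custom function: basic backward motion
def circular_left(): pass      # custom function: specialized circular motion
def circular_right(): pass     # custom function: specialized circular motion

while True:
    if is_pressed("UP"):
        move_forward()
    elif is_pressed("DOWN"):
        move_backward()
    elif is_pressed("SQUARE"):
        circular_left()
    elif is_pressed("CIRCLE"):
        circular_right()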

You will have to connect the Quarky to the Dabble application on your device; make sure Bluetooth is enabled on the device before connecting. After uploading the code, open Dabble and tap the plug option, as seen below. You will find your Quarky device in the list; tap it to connect.

Important Notes

  1. The code will only run after it is uploaded to the Rover, which must be connected to the laptop with a C-Type cable.
  2. You will be able to upload the code by selecting the Upload option beside the Stage option.
  3. There may be a case where you will have to upload the firmware first and then upload the code to the Rover. You will be able to upload the firmware in Quarky with the help of the following steps:
    1. Select the Quarky Palette from the Block Section.
    2. Select the Settings button on top of the palette.
    3. In the settings dialog box, scroll down and select the Upload Firmware option. This resets the Quarky regardless of whether any previous code was uploaded.
  4. After the Firmware is uploaded, click on the “Upload Code” option to upload the code.
  5. You will have to use the “When Quarky Starts Up” block rather than the conventional “When Green Flag Clicked” block for the code to run.

Output

Forward-Backward Motions:

Right-Left Motions:

Circular Left Motion:

Circular Right Motion:

Read More
[PictoBloxExtension]