
touching color () ?

Description

The block checks whether its sprite is touching a specified color. If it is, the block returns “true”.

Example

This project demonstrates how to use Machine Learning Environment to make a machine–learning model that identifies hand gestures and makes the Mars Rover move accordingly.

We are going to use the Hand Classifier of the Machine Learning Environment. The model works by analyzing your hand position with the help of 21 data points.
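
To make the 21 data points concrete, below is a minimal sketch (with dummy landmark values) of the feature vector such a classifier sees: each landmark contributes an (x, y) pair, giving 42 numbers per hand. The full PictoBlox loop that builds this vector appears in the gesture-controlled gripper example later on this page.

# A minimal sketch with dummy values: 21 hand landmarks, each an
# (x, y) pair, flattened into a 42-number feature vector.
landmarks = [(10 * i, 5 * i) for i in range(21)]   # dummy (x, y) points

features = []
for x, y in landmarks:
    features.append(x)
    features.append(y)

print(len(features))   # -> 42 values fed to the classifier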

Hand Gesture Classifier Workflow

Follow the steps below:

  1. Open PictoBlox and create a new file.
  2. Select Block Coding as the coding environment.
  3. Select the “Open ML Environment” option under the “Files” tab to access the ML Environment.
  4. Click on “Create New Project”.
  5. A window will open. Type in a project name of your choice and select the “Hand Gesture Classifier” extension. Click the “Create Project” button to open the Hand Pose Classifier window.
  6. You shall see the Classifier workflow with two classes already made for you. Your environment is all set. Now it’s time to upload the data.

Class in Hand Gesture Classifier

There are 2 things that you have to provide in a class:

  1. Class Name: The name by which the class will be referred.
  2. Hand Pose Data: This data can either be taken from the webcam or by uploading from local storage.

Note: You can add more classes to the projects using the Add Class button.

Adding Data to Class

You can perform the following operations to add and manage the data in a class.

  1. Naming the Class: You can rename the class by clicking on the edit button.
  2. Adding Data to the Class: You can add the data using the Webcam or by Uploading the files from the local folder.
    1. Webcam:
Note: You must add at least 20 samples to each of your classes for your model to train. More samples will lead to better results.

Training the Model

After data is added, it’s fit to be used in model training. In order to do this, we have to train the model. By training the model, we extract meaningful information from the hand pose, and that in turn updates the weights. Once these weights are saved, we can use our model to make predictions on data previously unseen.

The accuracy of the model should increase over time. The x-axis of the graph shows the epochs, and the y-axis represents the accuracy at the corresponding epoch. Remember, the higher the reading in the accuracy graph, the better the model. The range of the accuracy is 0 to 1.

Testing the Model

To test the model, simply enter the input values in the “Testing” panel and click on the “Predict” button.

The model will return the probability of the input belonging to the classes.

Export in Block Coding

Click on the “Export Model” button on the top right of the Testing box, and PictoBlox will load your model into the Block Coding Environment if you have opened the ML Environment in the Block Coding.

Logic

The Mars Rover will move according to the following logic:

  1. When the forward gesture is detected – the Mars Rover will move forward.
  2. When the backward gesture is detected – the Mars Rover will move backward.
  3. When the left gesture is detected – the Mars Rover will turn left.
  4. When the right gesture is detected – the Mars Rover will turn right.

Code

Logic

  1. First, we will initialize different Gesture classes.
  2. Then, we will open the recognition window, which identifies the different poses, and turn on the camera with a certain level of transparency so that images from the stage can be identified.
  3. If the identified class from the analyzed image is “forward,” the Mars Rover will move forward at a specific speed.
  4. If the identified class is “backward,” the Mars Rover will move backward.
  5. If the identified class is “left,” the Mars Rover will move left.
  6. If the identified class is “right,” the Mars Rover will move right.
  7. Otherwise, the Mars Rover will be in the home position.
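
For readers who prefer to see this branching written out, here is a minimal Python sketch of the same decision chain. The move/turn helpers are hypothetical stubs standing in for the actual Mars Rover blocks set up in PictoBlox.

# A minimal sketch of the gesture-to-motion logic; the helpers are
# hypothetical stand-ins for the real Mars Rover blocks.
def move_forward():  print("rover: forward")
def move_backward(): print("rover: backward")
def turn_left():     print("rover: turn left")
def turn_right():    print("rover: turn right")
def go_home():       print("rover: home position")

def drive(identified_class):
    if identified_class == "forward":
        move_forward()
    elif identified_class == "backward":
        move_backward()
    elif identified_class == "left":
        turn_left()
    elif identified_class == "right":
        turn_right()
    else:
        go_home()

drive("left")   # -> rover: turn left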

Output


Read More
Learn to control Mecanum Pick and Place Robot using Dabble App on your device with customized functions for different motions and activities.

Introduction

In this activity, we will control the Mecanum Pick and Place according to our needs using the Dabble application on our own Devices.

We will first understand how to operate Dabble and how to modify our code according to the requirements. The following image is the front page of the Dabble Application.

Select the Gamepad option from the Home Screen and we will then use the same gamepad to control our Mecanum Pick and Place.

Code

The following blocks represent the different functions created to control the Mecanum Pick and Place for different types of motion. We will use the arrow buttons to control the basic movements (forward, backward, lateral left, lateral right) and custom functions to control the pick and place actions. We will use the Triangle button to pick with the help of the arms and the Circle button to initiate the placing action (dropping the object). The Cross button rotates the robot to the right and the Square button rotates it to the left. The Select button stops the Mecanum whenever needed.

Note: You can always customize each and every function and button to make your own activities easily. You will have to add the Mecanum and Dabble extensions to access the blocks. To access the basic extensions required, make sure to select the Board as Quarky first.

Initialization

Main Code

You will have to connect the Quarky with the Dabble Application on your device. Make sure Bluetooth is enabled on the device before connecting. Connect the Mecanum to the Dabble application after uploading the code. You will be able to connect by clicking on the plug option in the Dabble Application as seen below. Select that plug option and you will find your Quarky device. Connect by clicking on the respective Quarky.

Important Notes

  1. The code will only run after it has been uploaded to the Mecanum, which must be connected to the laptop with a C-type cable.
  2. You will be able to upload the code by selecting the Upload option beside the Stage option.
  3. There may be a case where you will have to upload the firmware first and then upload the code to the Mecanum. You will be able to upload the firmware in Quarky with the help of the following steps:
    1. Select the Quarky Palette from the Block Section.
    2. Select the Settings button on top of the palette.
    3. In the settings dialog box, scroll down and select the Upload Firmware option. This resets the Quarky, clearing any previously uploaded code.
  4. After the Firmware is uploaded, click on the “Upload Code” option to upload the code.
  5. You will have to add the block “When Quarky Starts Up” rather than the conventional “When Green Flag is Clicked” for the code to run.

Output

Forward-Backward Motion:

Circular Right-Left Motion:

Lateral Right-Left Motion:

Pick and Place Mechanism with Dabble:

Read More
Learn to control Mecanum Pick and Place Robot using Dabble App on your device with customized functions for specialized motions using the Python Interface of the Pictoblox Software.

Introduction

In this activity, we will control the Mecanum Pick and Place according to our needs using the Dabble application on our own Devices.

We will first understand how to operate Dabble and how to modify our code according to the requirements. The following image is the front page of the Dabble Application.

Select the Gamepad option from the Home Screen and we will then use the same gamepad to control our Mecanum Pick and Place.

Code


The following blocks represent the different functions created to control the Mecanum Pick and Place for different types of motion. We will use the arrow buttons to control the basic movements (forward, backward, lateral left, lateral right) and custom functions to control the pick and place actions. We will use the Triangle button to initiate the pick action and the Circle button to initiate the place action. The Cross button rotates the robot to the right and the Square button rotates it to the left. The Select button stops the Mecanum whenever needed; this mapping is sketched in Python below.
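
Written out in Python, the button-to-action mapping described above is a simple dispatch table. The handler functions below are hypothetical stubs standing in for the real Mecanum functions created in PictoBlox.

# A minimal sketch of the gamepad mapping, with hypothetical stubs
# in place of the real Mecanum functions defined in PictoBlox.
def forward():       print("mecanum: forward")
def backward():      print("mecanum: backward")
def lateral_left():  print("mecanum: lateral left")
def lateral_right(): print("mecanum: lateral right")
def pick():          print("mecanum: pick")
def place():         print("mecanum: place")
def rotate_right():  print("mecanum: rotate right")
def rotate_left():   print("mecanum: rotate left")
def stop():          print("mecanum: stop")

# One entry per Dabble gamepad button used in this project
BUTTON_ACTIONS = {
    "Up": forward,        "Down": backward,
    "Left": lateral_left, "Right": lateral_right,
    "Triangle": pick,     "Circle": place,
    "Cross": rotate_right, "Square": rotate_left,
    "Select": stop,
}

BUTTON_ACTIONS["Triangle"]()   # -> mecanum: pick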

Note: You can always customize each and every function and button to make your own activities easily. You will have to add the Mecanum and Dabble extensions to access the functions. To access the basic extensions required, make sure to select the Board as Quarky first. Select the Python Coding Environment and, on the top right, click on Upload Mode for the code to work properly.

You will have to connect the Quarky with the Dabble Application on your device. Make sure Bluetooth is enabled on the device before connecting. Connect the Mecanum to the Dabble application after uploading the code. You will be able to connect by clicking on the plug option in the Dabble Application as seen below. Select that plug option and you will find your Quarky device. Connect by clicking on the respective Quarky.

Important Notes

  1. The code will only run after it has been uploaded to the Mecanum, which must be connected to the laptop with a C-type cable.
  2. You will be able to upload the Python code by selecting the Upload option beside the Stage option.
  3. There may be a case where you will have to upload the firmware first and then upload the code to the Mecanum. You will be able to upload the firmware in Quarky with the help of the following steps:
    1. Go to the Block Coding Environment and select the Quarky Palette from the Block Section.
    2. Select the Settings button on top of the palette.
    3. In the settings dialog box, scroll down and select the Upload Firmware option. This resets the Quarky, clearing any previously uploaded code.
  4. After the firmware is uploaded, you can shift to the Python mode and upload the code you have written. The upload button can be seen in the right section of the terminal, as shown below.

Output

Forward-Backward Motion:

Circular Right-Left Motion:

Lateral Right-Left Motion:

Pick and Place Mechanism with Dabble:

Read More
The example shows how to use a hand pose classifier in PictoBlox to make the Sign Classifier Bot.

Introduction

In this example project, we are going to create a machine learning model that can classify different sign messages from the camera feed or image.

Hand Gesture Classifier in Machine Learning Environment

The Hand Gesture Classifier is the extension of the ML Environment used for classifying different hand poses into different classes.

Hand Gesture Classifier Workflow

  1. Open PictoBlox and create a new file.
  2. You can click on “Machine Learning Environment” to open it.
  3. Click on “Create New Project”.
  4. A window will open. Type in a project name of your choice and select the “Hand Gesture Classifier” extension. Click the “Create Project” button to open the Hand Pose Classifier window.
  5. You shall see the Gesture Pose Classifier workflow with two classes already made for you. Your environment is all set. Now it’s time to upload the data.

Class in Hand Pose Classifier

Class is the category in which the Machine Learning model classifies the hand poses. Similar hand poses are put in one class.

Adding Data to Class

You can perform the following operations to add and manage the data in a class.

  1. Naming the Class: You can rename the class by clicking on the edit button.
  2. Adding Data to the Class: You can add the data using the Webcam or by Uploading the files from the local folder.
    1. Webcam:

      Note: You can edit the capture settings of the camera. When Hold to Record is on, images with the pose are captured for as long as the button is pressed; when it is off, you can set the start delay and the duration of the sample collection.

      If you want to change your camera feed, you can do it from the webcam selector in the top right corner.

Training the Model

After data is added, it’s fit to be used in model training. In order to do this, we have to train the model. By training the model, we extract meaningful information from the hand pose, and that in turn updates the weights. Once these weights are saved, we can use our model to make predictions on data previously unseen.

However, before training the model, there are a few hyperparameters that you should be aware of. Click on the “Advanced” tab to view them.

Note: These hyperparameters can affect the accuracy of your model to a great extent. Experiment with them to find what works best for your data.

Note: Hover your mouse over the question mark next to the hyperparameters to see their description.

It’s a good idea to train a numeric classification model for a high number of epochs. The model can be trained in both JavaScript and Python. In order to choose between the two, click on the switch on top of the Training panel.

Alert: Dependencies must be downloaded to train the model in Python; JavaScript will be chosen by default.

The accuracy of the model should increase over time. The x-axis of the graph shows the epochs, and the y-axis represents the accuracy at the corresponding epoch. Remember, the higher the reading in the accuracy graph, the better the model. The range of the accuracy is 0 to 1.

Testing the Model

To test the model, simply enter the input values in the “Testing” panel and click on the “Predict” button.

The model will return the probability of the input belonging to the classes.

Export in Block Coding

Click on the “Export Model” button on the top right of the Testing box, and PictoBlox will load your model into the Block Coding Environment if you have opened the ML Environment in the Block Coding.

Script

The idea is simple: we’ll add one image of each class in the “costume” column by making one new sprite, which we will display on the stage according to the input from the user. We’ll also rename each image according to its sign class type.

  1. Add one sign image as another sprite and upload at least one image of each sign class as a costume.
  2. Now, come back to the coding tab and select the Tobi sprite.
  3. We’ll start by adding a when flag clicked block from the Events palette.
  4. Add the “open recognition window” block from the Machine Learning palette.
  5. Add a “forever” block from the Control palette.
  6. Add the “if () then” block from the control palette for checking the user’s input.
  7. In the empty place of the “if () then” block, add an “is identified class ()” block from the Machine Learning palette. Select the appropriate class from the options.
  8. Inside the “if () then” block, add a “say ()” block from the Looks palette. Write an appropriate statement in the empty space.
  9. Inside the “if () then” block, add a “broadcast ()” block from the Events palette. Select the “New message” option and write an appropriate message for broadcasting to the other sprite.
  10. Repeat the “if () then” block code for the other classes, making appropriate changes for each class, and add the code just below it.
  11. The final code of the “Tobi” sprite is
  12. Now click on the other sprite and write its code.
  13. We’ll start writing code for this sprite by adding a when flag clicked block from the Events palette.
  14. Add the “hide” block from the Looks palette.
  15. Write a new code in the same sprite according to the class and add the “when I receive ()” block from the Events palette. Select the appropriate class from the options.
  16. Add the “show” block from the Looks palette.
  17. Add the “switch costume to ()” block from the Looks palette. Select the appropriate class from the options.
  18. Repeat the same code for other classes and make changes according to the class.

Final Output


Read More
Explore the capabilities of a sign detector robotic arm that can recognize and understand signs or signals in its surroundings.

Introduction

A sign detector robotic arm is a smart robot that can recognize and understand signs or signals in its surroundings. It uses cameras and other sensors to capture visual information and computer algorithms to analyze the signs. The robot can learn different types of signs through machine learning techniques. Once a sign is identified, the robotic arm can perform specific actions based on what the sign means. These robotic arms have many uses, such as helping in healthcare, manufacturing, transportation, and assisting people with communication disabilities. They are an exciting advancement in human-robot interaction, allowing robots to understand and respond to signs, expanding their abilities and applications.

Code

# Create the sprite and the recognition/robotic arm objects
sprite = Sprite('Tobi')

recocards = RecognitionCards()
recocards.video("on flipped")        # flipped camera feed on the stage
recocards.enablebox()                # draw a box around detected signs
recocards.setthreshold(0.6)          # minimum confidence for a valid sign
roboticArm = RoboticArm(1, 2, 3, 4)

roboticArm.sethome()                 # start from the home position
while True:
    recocards.analysecamera()        # analyse the current camera frame
    sign = recocards.classname()     # name of the recognised sign
    sprite.say(sign + ' detected')
    if recocards.count() > 0:
        if 'Turn Left' in sign:
            roboticArm.movebyinoneaxis(10, "X", 1000)

        if 'Turn Right' in sign:
            roboticArm.movebyinoneaxis(-10, "X", 1000)

        if 'Go' in sign:
            roboticArm.movebyinoneaxis(10, "Y", 1000)

        if 'U Turn' in sign:
            roboticArm.movebyinoneaxis(-10, "Y", 1000)

Logic

  1. Open the Pictoblox application.
  2. Select the block-based environment.
  3. Click on the Recognition Cards and robotic arm extension available in the left corner.
  4. Initialize the video on stage and set the transparency as 0%.
  5. Drag and drop the forever loop to continuously analyze the image from the stage and get the input from the camera.
  6. Show the bounding box around the sign detected from the stage.
  7. A RecognitionCards object is created and assigned to the variable ‘recocards‘. This object represents the functionality related to recognizing and analyzing signs or signals.
  8. We configure the video functionality of the RecognitionCards object. It sets the video to be flipped horizontally, ensuring that the signs appear correctly when displayed on the stage.
  9. Now the program activates the feature that displays a bounding box around the detected sign on the stage using the enablebox() function. This helps visually identify the location of the sign.
  10. Further sets the threshold value to 0.6 for the RecognitionCards object. The threshold determines the minimum confidence level required for a sign to be recognized. Any sign with a confidence level below 0.6 will not be considered valid.
  11. Then a RoboticArm object is created and assigned to the variable ‘roboticArm’. The numbers 1, 2, 3, and 4 represent the specific configuration or parameters of the robotic arm. The object allows control over the robotic arm’s movements.
  12. The robotic arm then moves to a predefined home position using sethome(). The home position is a reference point or starting point for the arm’s movements.
  13. We create a while loop that will continuously execute the code block.
  14. Then we capture and analyze the video input from the camera. The RecognitionCards object processes the captured frame to detect and recognize signs.
  15. We check if there is at least one sign detected by the RecognitionCards object. If the count of detected signs is greater than 0, the following code block will execute.
  16. If the detected sign is ‘Turn Left’, the robotic arm will move by 10 units along the X-axis over a duration of 1000 milliseconds. This function call controls the movement of the robotic arm.
  17. If the detected sign is ‘Turn Right’, the robotic arm will move by -10 units (the opposite direction) along the X-axis over a duration of 1000 milliseconds.
  18. If the detected sign is ‘Go’, the robotic arm will move by 10 units along the Y-axis over a duration of 1000 milliseconds.
  19. If the detected sign is ‘U Turn’, the robotic arm will move by -10 units along the Y-axis over a duration of 1000 milliseconds.

Output

Read More
Discover how gesture-controlled grippers are transforming human-robot interactions.

Introduction

Gesture-controlled grippers revolutionize the way humans interact with robotic systems by allowing the manipulation of robotic grippers through hand gestures. This innovative technology leverages computer vision and machine learning algorithms to interpret and respond to specific hand movements in real-time. By recognizing and analyzing gestures, the system translates them into commands for the gripper, providing users with a natural and intuitive interface for controlling its actions.

Gesture-controlled grippers have wide-ranging applications across the manufacturing, logistics, healthcare, and robotics industries. With a simple wave, pinch, or swipe of the hand, users can trigger actions like grasping, releasing, and repositioning objects the gripper holds.

Hand Gesture Classifier in Machine Learning Environment

The Hand Gesture Classifier is the extension of the ML Environment used for classifying different hand poses into different classes.

Follow the steps below:

  1. Open PictoBlox and create a new file.
  2. You can click on “Machine Learning Environment” to open it.
  3. Click on “Create New Project”.
  4. A window will open. Type in a project name of your choice and select the “Hand Gesture Classifier” extension. Click the “Create Project” button to open the Hand Pose Classifier window.
  5. You shall see the Classifier workflow with two classes already made for you. Your environment is all set. Now it’s time to upload the data.

Class in Hand Gesture Classifier

There are 2 things that you have to provide in a class:

  1. Class Name: The name by which the class will be referred.
  2. Hand Pose Data: This data can either be taken from the webcam or by uploading from local storage.

Note: You can add more classes to the projects using the Add Class button.
Adding Data to Class

You can perform the following operations to add and manage the data in a class.

  1. Naming the Class: You can rename the class by clicking on the edit button.
  2. Adding Data to the Class: You can add the data using the Webcam or by Uploading the files from the local folder.
    1. Webcam:
Note: You must add at least 20 samples to each of your classes for your model to train. More samples will lead to better results.
Training the Model

After data is added, it’s fit to be used in model training. In order to do this, we have to train the model. By training the model, we extract meaningful information from the hand pose, and that in turn updates the weights. Once these weights are saved, we can use our model to make predictions on data previously unseen.

The accuracy of the model should increase over time. The x-axis of the graph shows the epochs, and the y-axis represents the accuracy at the corresponding epoch. Remember, the higher the reading in the accuracy graph, the better the model. The range of accuracy is 0 to 1.

Testing the Model

To test the model, simply enter the input values in the “Testing” panel and click on the “Predict” button.

The model will return the probability of the input belonging to the classes.

Export in Block Coding

Click on the “Export Model” button on the top right of the Testing box, and PictoBlox will load your model into the Block Coding Environment if you have opened the ML Environment in the Block Coding.

Code

Output

Read More
Discover how gesture-controlled grippers are transforming human-robot interactions.

Introduction

Gesture-controlled grippers revolutionize the way humans interact with robotic systems by allowing the manipulation of robotic grippers through hand gestures. This innovative technology leverages computer vision and machine learning algorithms to interpret and respond to specific hand movements in real-time. By recognizing and analyzing gestures, the system translates them into commands for the gripper, providing users with a natural and intuitive interface for controlling its actions.

Gesture-controlled grippers have wide-ranging applications across the manufacturing, logistics, healthcare, and robotics industries. With a simple wave, pinch, or swipe of the hand, users can trigger actions like grasping, releasing, and repositioning objects the gripper holds.


Hand Gesture Classifier in Machine Learning Environment

The Hand Gesture Classifier is the extension of the ML Environment used for classifying different hand poses into different classes.

Follow the steps below:

  1. Open PictoBlox and create a new file.
  2. You can click on “Machine Learning Environment” to open it.
  3. Click on “Create New Project”.
  4. A window will open. Type in a project name of your choice and select the “Hand Gesture Classifier” extension. Click the “Create Project” button to open the Hand Pose Classifier window.
  5. You shall see the Classifier workflow with two classes already made for you. Your environment is all set. Now it’s time to upload the data.


Class in Hand Gesture Classifier

There are 2 things that you have to provide in a class:

  1. Class Name: The name by which the class will be referred.
  2. Hand Pose Data: This data can either be taken from the webcam or by uploading from local storage.

Note: You can add more classes to the projects using the Add Class button.


Adding Data to Class

You can perform the following operations to add and manage the data in a class.

  1. Naming the Class: You can rename the class by clicking on the edit button.
  2. Adding Data to the Class: You can add the data using the Webcam or by Uploading the files from the local folder.
    1. Webcam:
Note: You must add at least 20 samples to each of your classes for your model to train. More samples will lead to better results.


Training the Model

After data is added, it’s fit to be used in model training. In order to do this, we have to train the model. By training the model, we extract meaningful information from the hand pose, and that in turn updates the weights. Once these weights are saved, we can use our model to make predictions on data previously unseen.

The accuracy of the model should increase over time. The x-axis of the graph shows the epochs, and the y-axis represents the accuracy at the corresponding epoch. Remember, the higher the reading in the accuracy graph, the better the model. The range of accuracy is 0 to 1.


Testing the Model

To test the model, simply enter the input values in the “Testing” panel and click on the “Predict” button.

The model will return the probability of the input belonging to the classes.

Export in Python Coding

Click on the “Export Model” button on the top right of the Testing box, and PictoBlox will load your model into the Python Coding Environment if you have opened the ML Environment in Python Coding.

Code

####################imports####################
# Do not change

import numpy as np
import tensorflow as tf
import time

roboticArm = RoboticArm(1,2,3,4)
# Do not change
####################imports####################

#Following are the model and video capture configurations
# Do not change

model=tf.keras.models.load_model(
    "num_model.h5",
    custom_objects=None,
    compile=True,
    options=None)
pose = Posenet()                                                    # Initializing Posenet
pose.enablebox()                                                    # Enabling video capture box
pose.video("on",0)                                                  # Taking video input
class_list=['Open','Close']                  # List of all the classes

def runQuarky(predicted_class):
  if predicted_class == "Open":
    roboticArm.controlgripper("open")
        
  if predicted_class == "Close":
    roboticArm.controlgripper("close")

	
# Do not change
###############################################

#This is the while loop block, computations happen here
# Do not change

while True:
  pose.analysehand()                                             # Using Posenet to analyse hand pose
  coordinate_xy=[]
    
    # for loop to iterate through 21 points of recognition
  for i in range(21):
    if(pose.gethandposition(1,i,0)!="NULL"  or pose.gethandposition(2,i,0)!="NULL"):
      coordinate_xy.append(int(240+float(pose.gethandposition(1,i,0))))
      coordinate_xy.append(int(180-float(pose.gethandposition(2,i,0))))
    else:
      coordinate_xy.append(0)
      coordinate_xy.append(0)
            
  coordinate_xy_tensor = tf.expand_dims(coordinate_xy, 0)        # Expanding the dimension of the coordinate list
  predict=model.predict(coordinate_xy_tensor)                    # Making an initial prediction using the model
  predict_index=np.argmax(predict[0], axis=0)                    # Generating index out of the prediction
  predicted_class=class_list[predict_index]                      # Tallying the index with class list
  print(predicted_class)
  
  runQuarky(predicted_class)  
  # Do not change

Logic

  1. If the value of the predicted_class variable is “Open”, then the robotic arm’s gripper will open.
  2. If the value of the predicted_class variable is “Close”, then the robotic arm’s gripper will close.
def runQuarky(predicted_class):
  if predicted_class == "Open":
    roboticArm.controlgripper("open")
        
  if predicted_class == "Close":
    roboticArm.controlgripper("close")

Output

Read More
This project uses face recognition and an IR sensor to identify and authenticate authorized people and open the door accordingly. Learn how to create this system with Python code and hardware.

The project uses face recognition to identify authorized people and opens the door accordingly.

Circuit

We are using 2 devices in this project:

  1. IR Sensor: The IR sensor provides information if there is an obstacle in front or not. The IR sensor connections are as follows:
    1. GND Pin connected to GND of the Quarky Expansion Board.
    2. VCC Pin connected to VCC of the Quarky Expansion Board.
    3. Signal Pin connected to D3 of the Quarky Expansion Board.
  2. Servo Motor: The servo motor controls the door of the IoT house and is connected to servo port 5 of the Quarky Expansion Board.

Alert: Make sure you have the Door Servo Motor calibrated.

Face Recognition

We will be using Face Detection extension for making the face recognition application.

Storing the Face Authorised for IoT House

This code is used to add a new face to a system:

  1. The first step is to create a Sprite object with the name ‘Tobi’. Then, a Face Detection object is created. The time library is imported.
  2. Then, a function called addFace() is defined. This function allows us to add a new face to the system. First, the video feed from the camera is turned on. Then, the camera is analyzed for a face. If one face has been detected, the user is asked to select a slot (1 to 10) and enter a name for the new face which is then added to the system. Finally, the video feed from the camera is turned off.
  3. The code runs a loop, which checks if the ‘a’ key has been pressed. If it is, the addFace() function is called.
#Create a new Sprite object with the name 'Tobi'
sprite = Sprite('Tobi')

#Create a new Face Detection object
fd = FaceDetection()

#Import the time library
import time

#Set the threshold for face detection to 0.5
fd.setthreshold(0.5)
#Turn off the video feed from the camera
fd.video("off", 0)
#Enable the box to be drawn around the detected face
fd.enablebox()

#Define a function that adds a new face to the system
def addFace():
  #Create a flag to keep track if a new face has been added
  faceFlag = 0

  #Turn on the video feed from the camera
  fd.video("on", 0)
  time.sleep(1)

  #Keep looping until a new face has been added
  while faceFlag == 0:
    #Analyse the camera for a face
    fd.analysecamera()

    #Check if one face has been detected
    if fd.count() == 1:
      #Ask the user which slot the face should be added to
      sprite.input("Select the slot (1 to 10)?")
      #Store the slot number the user provided
      faceSlot = sprite.answer()

      #Ask the user to enter a name for the new face
      sprite.input("Enter the name of the face")
      #Store the name the user provided
      faceName = sprite.answer()

      #Add the face to the system with the provided slot number and name
      fd.addclassfromcamera(faceSlot, faceName)

      #Set the faceFlag to 1 to stop the loop
      faceFlag = 1

  #Turn off the video feed from the camera
  fd.video("off", 0)

#Keep running the loop forever
while True:
  #Check if the 'a' key has been pressed
  if sprite.iskeypressed("a"):
    #If yes, call the addFace() function
    addFace()

Working of an IR Sensor

An infrared (IR) sensor is a type of sensor that senses whether something is close to it or not. IR stands for infrared; infrared light lies just outside our visible spectrum.

An IR sensor has a white LED (transmitter) and a photodiode (receiver). The transmitter emits IR light, and the receiver detects light reflected from objects within the sensor’s range, which can be adjusted with a potentiometer. The sensor has two LED indicators: a power LED, which is always on, and a signal LED, which is on when an object is detected and off when nothing is detected.

The signal LED has two states or situations:

  1. ON (Active) when it detects an object
  2. OFF (Inactive) when it doesn’t detect any object
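
As a minimal sketch of how code can read these two states, here is a polling loop that uses the same IoTHouse call as the full project code below (it assumes PictoBlox’s Python environment, where IoTHouse is predefined):

import time

house = IoTHouse()   # predefined in PictoBlox's Python environment

while True:
    # irstatus("D3") is truthy while an object is detected (signal LED ON)
    if house.irstatus("D3"):
        print("Object detected - signal LED ON")
    else:
        print("No object - signal LED OFF")
    time.sleep(0.5)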

Python Code for IR & Face Recognition-Based Smart Door Opening System

This code creates a program that can add a new face to the system, and then recognize and authenticate the user:

  1. It uses the Sprite, FaceDetection, Quarky, Expansion, and IoTHouse libraries to perform these tasks. It also imports the time library for timing purposes.
  2. The program sets the threshold for face detection to 0.5, turns off the video feed from the camera, and enables the box to be drawn around the detected face.
  3. It also moves the servo connected to position 5 of the expansion board to 100 degrees to close the door.
  4. It defines two functions called addFace() and authenticate().
  5. The authenticate() function turns on the video feed from the camera, recognizes the face in the camera, and speaks out the name of the recognized user if the face has been recognized. It then returns 1 to indicate the user has been authenticated.
  6. The program then keeps running the loop forever. It checks if the ‘a’ key has been pressed and, if yes, calls the addFace() function.
  7. It also checks if the IR sensor is active and if yes, calls the authenticate() function. If the user has been authenticated, it moves the servo to 0 degrees to open the door and then back to 100 degrees to close the door after some time.
#Create a new Sprite object with the name 'Tobi'
sprite = Sprite('Tobi')

#Create a new Face Detection object
fd = FaceDetection()

#Import the time library
import time

#Create a new Quarky object
quarky = Quarky()

#Create a new Expansion object
expansion = Expansion()
house = IoTHouse()

#Set the threshold for face detection to 0.5
fd.setthreshold(0.5)
#Turn off the video feed from the camera
fd.video("off", 0)
#Enable the box to be drawn around the detected face
fd.enablebox()

#Move a servo on the expansion board to position 5 and move it to 100 degrees
expansion.moveservo(5, 100)

#Define a function that adds a new face to the system
def addFace():
  #Create a flag to keep track if a new face has been added
  faceFlag = 0

  #Turn on the video feed from the camera
  fd.video("on", 0)
  time.sleep(1)

  #Keep looping until a new face has been added
  while faceFlag == 0:
    #Analyse the camera for a face
    fd.analysecamera()

    #Check if one face has been detected
    if fd.count() == 1:
      #Ask the user which slot the face should be added to
      sprite.input("Select the slot (1 to 10)?")
      #Store the slot number the user provided
      faceSlot = sprite.answer()

      #Ask the user to enter a name for the new face
      sprite.input("Enter the name of the face")
      #Store the name the user provided
      faceName = sprite.answer()

      #Add the face to the system with the provided slot number and name
      fd.addclassfromcamera(faceSlot, faceName)

      #Set the faceFlag to 1 to stop the loop
      faceFlag = 1

  #Turn off the video feed from the camera
  fd.video("off", 0)


#Define a function that authenticates the user
def authenticate():
  #Turn on the video feed from the camera
  fd.video("on", 0)
  time.sleep(1)

  #Recognise the face shown on the stage (camera feed)
  fd.recognisefromstage()

  #Check if one or more face has been detected
  if fd.count() > 0:
    #Loop through all the detected faces
    for i in range(1, fd.count() + 1):
      #Check if the face has been recognised
      if fd.getclassname(i) != "unknown":
        #Speak out the name of the recognised user
        sprite.say("Authorised - " + fd.getclassname(i), 2)

        #Turn off the video feed from the camera
        fd.video("off", 0)

        #Return 1 to indicate the user has been authenticated
        return 1
  
  #Turn off the video feed from the camera
  fd.video("off", 0)
  #Return 0 to indicate the user has not been authenticated
  return 0

#Keep running the loop forever
while True:
  #Check if the 'a' key has been pressed
  if sprite.iskeypressed("a"):
    #If yes, call the addFace() function
    addFace()

  #Check if the IR sensor has detected an object
  if house.irstatus("D3"):
    #If yes, call the authenticate() function
    if authenticate() == 1:
      #Move the servo to 0 degrees
      expansion.moveservo(5, 0)
      time.sleep(2)
      #Move the servo back to 100 degrees
      expansion.moveservo(5, 100)


Output

Read More
Learn how oscillators are utilized to create seamless movements in Quadruped robots.

Introduction

In this example, you will understand how the oscillator concept is used to create smooth motions for the Quadruped robot. The oscillator is the primary component for making the smooth movements of Quarky Quadruped like walking or turning.

How does the Oscillator work?

The purpose of the oscillator in the code is to generate a sinusoidal waveform that can be used to control the motion of a servo motor. The parameters of the oscillator are defined by the offset, amplitude, period, and phase difference.

  1. Offset: The offset is the starting angle of the servo motor (oscillator). It is the angle at which the servo motor starts moving.
  2. Amplitude: The amplitude of the servo motor (oscillator) is the maximum angle the servo motor can rotate.
  3. Period: The period is the total time taken by the oscillator to complete one full cycle.
  4. Phase Difference: The phase difference is the angular displacement of the oscillator from its starting point.

In mathematical terms, the servo angle is calculated using the following formula:

Angle = Offset + Amplitude × sin(2π × Time / TimePeriod + PhaseDiff)
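
As a quick sanity check, here is a minimal Python sketch of this formula, assuming the sine argument advances by a full 2π per period so that the motion repeats every TimePeriod milliseconds:

import math

def oscillator_angle(t_ms, offset, amplitude, period_ms, phase_diff):
    # Angle = Offset + Amplitude * sin(2*pi * Time / TimePeriod + PhaseDiff)
    return offset + amplitude * math.sin(2 * math.pi * t_ms / period_ms + phase_diff)

# Parameters from the single-servo example below:
# offset 45, amplitude 45, period 1000 ms, phase difference 0
for t in (0, 250, 500, 750, 1000):
    print(t, round(oscillator_angle(t, 45, 45, 1000, 0), 1))
# -> 45.0, 90.0, 45.0, 0.0, 45.0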

Single Servo Oscillation

Let’s apply the concept of oscillation to the Quadruped. We want the front right hip servo to oscillate like this:

As you can observe, the following are the oscillator parameters that can be used to get the desired motion:

  1. Offset: 45 degrees
  2. Amplitude: 45 degrees
  3. Time period: 1000
  4. Phase Difference: 0 degrees

Look at the parameters carefully and see if you can understand how it works.

Now, to execute this on Quarky, we will use the set () amplitude () offset () period () phase difference () block, which sets the oscillator parameters for the selected servo motor.

Next, we will use the oscillate for cycles () block to run the oscillator for the specified number of complete cycles.

Create the following script:

Click on the green flag to test the script:

As you can observe, the servo motor starts from 45 degrees and completes one oscillation. You can observe the servo angle here:

Single Servo Oscillation with Phase Difference

Let’s see how to use the Phase Difference to delay the move.

Create the following script:

Click on the green flag to test the script:

As you can observe, the servo motor starts from 90 degrees and completes one oscillation. You can observe the servo angle here:

Hope you have understood the oscillator. Let’s change the difficulty level and try the oscillator on all servo motors.

All Servo Oscillator

Create the script to make the left-right motion:

Let’s decode it. Let us play the motion while keeping the Quadruped in the air.

As you can observe, all the hip joints start from 45 degrees and then oscillate. The script for this is here:

Click the green flag and test the code.

Try Other Oscillator Motions

  1. Code 1:
  2. Code 2:

Try to change the parameters and create your actions.

Read More
Learn how to use Pose Classifier, an extension of ML Environment. Follow the step-by-step tutorial on using image classifier in block coding.

Introduction

The Pose Classifier is the extension of the ML Environment used for classifying different body poses into different classes.

The model works by analyzing your body position with the help of 17 data points.

Pose Classifier Workflow

  1. Open PictoBlox and create a new file.
  2. You can click on “Machine Learning Environment” to open it.
  3. Click on “Create New Project”.
  4. A window will open. Type in a project name of your choice and select the “Pose Classifier” extension. Click the “Create Project” button to open the Pose Classifier window.
  5. You shall see the Pose Classifier workflow with two classes already made for you. Your environment is all set. Now it’s time to upload the data.

Class in Pose Classifier

Class is the category in which the Machine Learning model classifies the poses. Similar poses are put in one class.

There are 2 things that you have to provide in a class:

  1. Class Name: The name to which the class will be referred.
  2. Pose Data: This data can be taken from the webcam or uploaded from local storage.
Adding Data to Class

You can perform the following operations to add and manage the data in a class.

  1. Naming the Class: You can rename the class by clicking on the edit button.
  2. Adding Data to the Class: You can add the data using the Webcam or by Uploading the files from the local folder.
    1. Webcam:
  3. We use two classes, “up” and “down,” in our code, as depicted in the picture.
Training the Model

After data is added, it’s fit to be used in model training. To do this, we have to train the model. By training the model, we extract meaningful information from the pose, and that in turn updates the weights. Once these weights are saved, we can use our model to make predictions on previously unseen data.

The accuracy of the model should increase over time. The x-axis of the graph shows the epochs, and the y-axis represents the accuracy at the corresponding epoch. Remember, the higher the reading in the accuracy graph, the better the model. The range of accuracy is 0 to 1.

Testing the Model

To test the model, simply enter the input values in the “Testing” panel and click on the “Predict” button.

The model will return the probability of the input belonging to the classes.

Export in Block Coding

Click on the “Export Model” button on the top right of the Testing box, and PictoBlox will load your model into the Block Coding Environment if you have opened the ML Environment in the Block Coding.

Code Explanation

  1. First, select the Humanoid extension from the palette.
  2. Drag and drop the “forever” block to create a continuous loop.
  3. Drag and drop the “if” and “else” blocks from the Control palette. If the “up” class is detected in the pose, the Humanoid will move its left hand to 180 degrees and its right hand to 0 degrees, imitating a human-like pose.
  4. If the “down” class is indicated on the screen, PictoBlox will prompt saying “down” and the Humanoid will move its left and right hand to mimic a human pose, with a rotation of 90 degrees each.
  5. Otherwise, the Humanoid will assume a home position, remaining still with no movement.

Code

Logic

  1. The first model is used to identify the pose of a human, presumably using pose estimation techniques.
  2. Two classes, “up” and “down,” are added to represent the different angles of a person’s hand position.
  3. The model is trained using labeled data to learn to predict the class (up or down) based on the input image or video frame.
  4. The trained model is then used to predict the class of an image or video frame captured from a webcam, indicating the current position of the person.
  5. The code includes an if-else condition to handle the predicted class. If the model identifies the person’s pose as “up,” the Humanoid will mimic the same position and angle of the person’s hand.
  6. If the model identifies the person’s pose as “down”, the Humanoid will set its hand angle to a down position.

Output

Read More
Learn how to create custom sounds to control Quadruped with the Audio Classifier of the Machine Learning Environment in PictoBlox.

Introduction

A Sound-Based Quadruped with Machine Learning refers to a Quadruped robot that can perceive and interact with its environment through sound-based sensing and uses machine-learning techniques to process and analyze the auditory data it receives.
Quadruped robots with machine learning have the potential to greatly enhance the way we interact with machines and each other, making communication more natural and intuitive while also enabling new applications in fields such as healthcare, education, and entertainment.
In this activity, we will use the Machine Learning Environment of the Pictoblox Software. We will use the Audio Classifier of the Machine Learning Environment and create our custom sounds to control the Quadruped.

Audio Classifier Workflow

Follow the steps below to create your own Audio Classifier Model:

  1. Open PictoBlox and create a new file.
  2. Select Block Coding as the coding environment.
  3. Select the “Open ML Environment” option under the “Files” tab to access the ML Environment.
  4. A new window will open. Type in an appropriate project name of your choice and select the “Audio Classifier” extension. Click the “Create Project” button to open the Audio Classifier Window.
  5. You shall see the Classifier workflow with two classes already made for you. Your environment is all set. Now it’s time to upload the data.
  6. As you can observe in the above image, we will add two classes for audio. We will be able to add audio samples with the help of the microphone. Rename class 1 as “Clap” and class 2 as “Snap”.

Note: You can add more classes to the projects using the Add Class button.

Adding Data to Class

You can perform the following operations to add and manage the data in a class.

  1. Naming the Class: You can rename the class by clicking on the edit button.
  2. Adding Data to the Class: You can add the data using the Microphone.
  3. You will be able to add audio samples to each class; make sure you add at least 20 samples to each class for the model to run with good accuracy.
  4. Add the first class as “clap”  and record the audio for clap noises through the microphone.
  5. Add the second class as “snap” and record the audio for snap noises through the microphone.

Note: You can change a class name only at the start, before adding any audio samples. You will not be able to change the class name after adding audio samples to the respective class.

Training the Model

After data is added, it’s fit to be used in model training. To do this, we have to train the model. By training the model, we extract meaningful information from the audio samples, and that in turn updates the weights. Once these weights are saved, we can use our model to make predictions on data previously unseen.

The accuracy of the model should increase over time. The x-axis of the graph shows the epochs, and the y-axis represents the accuracy at the corresponding epoch. Remember, the higher the reading in the accuracy graph, the better the model. The range of accuracy is 0 to 1.

Testing the Model

To test the model, simply use the microphone directly and check the classes as shown in the image below:

You will be able to test the difference in audio samples recorded from the microphone as shown below:

Export in Block Coding

Click on the “Export Model” button on the top right of the Testing box, and PictoBlox will load your model into the Block Coding Environment if you have opened the ML Environment in the Block Coding.


The Quadruped will move according to the following logic:

  1. When the audio is identified as a “clap” sound – the Quadruped will move forward.
  2. When a “snap” sound is detected – the Quadruped will move backward.


Note: You can add even more classes with different types of differentiating sounds to customize your control. This is just a small example; you can build your own sound-controlled Quadruped with the same easy, stepwise procedure.

Code

Logic

  1. First, initialize the Quadruped extension.
  2. Then, initialize a forever loop to continuously analyze the audio input.
  3. If the program detects a clap sound, the Quadruped will move forward at a specific speed.
  4. Similarly, if it identifies a snap sound, the Quadruped will move backward at a specific speed.
  5. Otherwise, the Quadruped will remain in its initial position (home position).

Output

Read More
The example shows how to use an audio classifier in PictoBlox to make the Bird Audio Classifier Bot.

Introduction

In this example project, we are going to create a machine learning model that can classify different audio messages of birds from the computer’s microphone feed.

Audio Classifier in Machine Learning Environment

The Audio Classifier is the extension of the ML Environment used for classifying different bird voices.

Audio Classifier Workflow

Follow the steps below to create your own Audio Classifier Model:

  1. Open PictoBlox and create a new file.
  2. Select Block Coding as the coding environment.
  3. Select the “Open ML Environment” option under the “Files” tab to access the ML Environment.
  4. A new window will open. Type in an appropriate project name of your choice and select the “Audio Classifier” extension. Click the “Create Project” button to open the Audio Classifier Window.
  5. You shall see the Classifier workflow with two classes already made for you. Your environment is all set. Now it’s time to upload the data.
  6. As you can observe in the above image, we will add many classes for audio. We will be able to add audio samples with the help of the microphone.

Note: You can add more classes to the projects using the Add Class button.

Adding Data to Class

You can perform the following operations to add and manage the data in a class.

  1. Naming the Class: You can rename the class by clicking on the edit button.
  2. Adding Data to the Class: You can add the data using the Microphone.
  3. You will be able to add audio samples to each class; make sure you add at least 20 samples to each class for the model to run with good accuracy.

Training the Model

After data is added, it’s fit to be used in model training. To do this, we have to train the model. By training the model, we extract meaningful information from the audio samples, and that in turn updates the weights. Once these weights are saved, we can use our model to make predictions on data previously unseen.

The accuracy of the model should increase over time. The x-axis of the graph shows the epochs, and the y-axis represents the accuracy at the corresponding epoch. Remember, the higher the reading in the accuracy graph, the better the model. The range of accuracy is 0 to 1.

Testing the Model

To test the model, simply use the microphone directly and check the classes as shown in the image below:

You will be able to test the difference in audio samples recorded from the microphone as shown below:

Export in Block Coding

Click on the “Export Model” button on the top right of the Testing box, and PictoBlox will load your model into the Block Coding Environment if you have opened the ML Environment in the Block Coding.

Script

The idea is simple: we’ll add one image of each class in the “costume” column by making one new sprite, which we will display on the stage according to the input from the user. We’ll also rename each image according to its bird class type.

  1. Add one bird image as another sprite and upload at least one image of each bird class as a costume.
  2. Now, come back to the coding tab and select the Tobi sprite.
  3. We’ll start by adding a when flag clicked block from the Events palette.
  4. Add the “open recognition window” block from the Machine Learning palette.
  5. Add a “when () is predicted” block from the Machine Learning palette. Select the appropriate class from the options.
  6. Add a “say () for () seconds” block from the Looks palette. Write an appropriate statement in the empty space.
  7. Repeat the same code for the other classes and make changes according to the class.
  8. For the “BackNoise” class, don’t add any statement in the empty space of the “say () for () seconds” block.
  9. The final code of the “Tobi” sprite is
  10. Now click on the other sprite and write its code.
  11. We’ll start writing code for this sprite by adding a “when () is predicted” block from the Machine Learning palette.
  12. Add the “switch costume to ()” block from the Looks palette. Select the appropriate class from the options.
  13. Repeat the same code for the other classes and make changes according to the class.
  14. The final code of the other sprite is

Final Output

Read More
Learn about potentiometers, their working principle, and applications as variable resistors or voltage dividers.

Potentiometer and Its Working

A potentiometer is a versatile three-terminal resistor that forms an adjustable voltage divider or variable resistor (rheostat). It consists of two terminals connected to a resistive element and a third terminal connected to an adjustable wiper. The potentiometer can vary the amount of resistance in the circuit based on the wiper’s position.
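
In voltage-divider mode, the wiper voltage follows the standard divider relation:

Vout = Vin × (R2 / (R1 + R2))

where R1 and R2 are the resistances on either side of the wiper. Moving the wiper changes the R1/R2 split, and hence the Vout that the Arduino reads on its analog pin.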

Circuit Diagram

Code

  1. Create a variable called “brightness” and set it to 0.
  2. Add the “forever” block from the control palette.
  3. Use the “map” block from the Arduino palette to convert the input range (0-1023) to the desired range (0-255) for brightness control; the sketch after this list shows the formula the block computes. Read the value from Arduino pin A0 and place it in the first space of the map block.
  4. Set the brightness variable to the mapped value using the “set brightness to” block.
  5. Use any PWM pin of the Arduino to connect an LED.
  6. Set the PWM value to the brightness variable.
  7. Add these two blocks inside the forever block.
  8. Finally, add the when flag clicked event to complete the script.
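
For reference, here is a minimal Python sketch of what the “map” block computes (the standard Arduino map() formula), assuming a 10-bit ADC reading (0-1023) scaled to an 8-bit PWM value (0-255):

# The standard Arduino map() formula, written out in Python.
def map_value(x, in_min, in_max, out_min, out_max):
    return (x - in_min) * (out_max - out_min) // (in_max - in_min) + out_min

print(map_value(0, 0, 1023, 0, 255))      # -> 0   (pot at one end)
print(map_value(512, 0, 1023, 0, 255))    # -> 127 (pot at mid-travel)
print(map_value(1023, 0, 1023, 0, 255))   # -> 255 (pot at the other end)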

Script

Output

Read More
Learn how to interface an 8×8 LED dot matrix with an Arduino Uno board. Follow our step-by-step guide to set up the circuit diagram and implement the code.

8×8 Dot Matrix with Arduino

In this project, we will explore how to effectively use the 8×8 LED dot matrix with an Arduino Uno board. The 8×8 LED matrix contains 64 LEDs arranged in an 8×8 grid, forming a versatile display module. By connecting each row and column to digital pins, we can control the LED matrix and showcase a wide array of patterns, emojis, and animations. Additionally, cascading multiple dot matrices together enables us to expand the display without the need for extra pins.
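
To see how the 64 LEDs map onto rows and columns, here is a minimal Python sketch that stores a pattern as eight row bytes, one bit per column. The smiley below is a hypothetical pattern, not data taken from the display library:

# One byte per row, one bit per column; bit 7 is the leftmost LED.
SMILEY = [
    0b00111100,
    0b01000010,
    0b10100101,
    0b10000001,
    0b10100101,
    0b10011001,
    0b01000010,
    0b00111100,
]

for row in SMILEY:
    print(''.join('#' if row & (1 << (7 - col)) else '.' for col in range(8)))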

Circuit Diagram

Code

  1. Add the “when Arduino starts up” block.
  2. Initialize the display module and its pin connections with the Arduino.
  3. From the Control palette, add the forever block.
  4. To display any emoji or pattern, use the display block.
  5. Choose any emoji or emotion, add it inside the forever loop, and add a time delay.
  6. Again, choose a different emoji or emotion and add a time delay.
  7. With this, your script is complete; upload the C++ code to the Arduino using the upload button.

Script

Output

Read More
This project demonstrates how to interface an RFID sensor with a Quarky to control the door of an IoT-enabled house using an authorized RFID tag.

This project demonstrates how to interface an RFID sensor with a Quarky to control the door of an IoT-enabled house using an authorized RFID tag.

RFID to Quarky Circuit

Note: We are connecting the RFID sensor directly to the Quarky board.

RFID is short for “radio-frequency identification” and refers to a technology whereby a reader captures digital information encoded in RFID tags. The RFID sensor has a number of pins, which you have to connect as follows:

  1. The GND of the RFID sensor is connected to the GND of Quarky.
  2. The 3.3V of the RFID sensor is connected to the V of Quarky.
  3. The SDA Pin of the RFID sensor is connected to the A2 Pin of Quarky.
  4. The SCK Pin of the RFID sensor is connected to the D1 Pin of Quarky.
  5. The MOSI Pin of the RFID sensor is connected to the D2 Pin of Quarky.
  6. The MISO Pin of the RFID sensor is connected to the D3 Pin of Quarky.

The servo motor is connected to the S1 of Quarky.

Making RFID Master Tag

The following code makes any RFID tag a master tag that can be authorized for security.
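Since the script itself appears as an image, here is a minimal sketch of its core steps, using the IoTHouse Python API shown in the Python version of this project later in this document; the tag name is a placeholder.

house = IoTHouse()
house.initialiserfid()                  # initialise the RFID reader
if house.writetorfid("UserName", 2):    # write a placeholder name to the tag
  house.setmaster()                     # mark this tag as the master tag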

This is how it looks:

Code

The following code checks the RFID tag and opens the door.

Output

Read More
This project demonstrates how to interface an RFID sensor with a Quarky to control the door of an IoT-enabled house using an authorized RFID tag. Learn how to write the code for RFID authentication and see the circuit diagram for connecting the RFID sensor to the Quarky board.

This project demonstrates how to interface an RFID sensor with a Quarky to control the door of an IoT-enabled house using an authorized RFID tag.

RFID to Quarky Circuit

Note: We are connecting the RFID sensor directly to the Quarky board.

RFID is short for “radio-frequency identification” and refers to a technology whereby a reader captures digital information encoded in RFID tags. The RFID sensor has a number of pins, which you have to connect as follows:

  1. The GND of the RFID sensor is connected to the GND of Quarky.
  2. The 3.3V of the RFID sensor is connected to the V of Quarky.
  3. The SDA Pin of the RFID sensor is connected to the A2 Pin of Quarky.
  4. The SCK Pin of the RFID sensor is connected to the D1 Pin of Quarky.
  5. The MOSI Pin of the RFID sensor is connected to the D2 Pin of Quarky.
  6. The MISO Pin of the RFID sensor is connected to the D3 Pin of Quarky.

The servo motor is connected to the S1 of Quarky.

Making RFID Master Tag

The following code makes any RFID tag a master tag that can be authorized for security:

  1. Tobi is a sprite object that helps to create an RFID tag. A Quarky object and an IoTHouse object are also created.
  2. The IoTHouse object needs to be initialized and a flag (MasterFlag) is set to 0 to indicate if the RFID tag has been written yet.
  3. Tobi then asks the user for the name of the user for the RFID tag.
  4. A loop is created that will keep running until the RFID tag is written. Tobi will ask the user to put the RFID tag and then try to write it to the house.
    1. If it is successful, the MasterFlag is set to 1 and the master tag of the RFID is set. Finally, Tobi will let the user know the RFID tag is created.
    2. If the RFID tag couldn’t be written, Tobi will ask the user to put the RFID tag again.
# Create a sprite object for 'Tobi'
sprite = Sprite('Tobi')

# Create a Quarky object
quarky = Quarky()

# Create an IoTHouse object
house = IoTHouse()

# Initialise the RFID tag
house.initialiserfid()

# Set a flag to indicate if the RFID tag has been written
MasterFlag = 0

# Ask the user for the name of the user for the RFID tag
sprite.input("What is the name of the user for this RFID tag?")

# Keep looping until the RFID tag is written
while MasterFlag == 0:
  # Ask the user to put the RFID tag
  sprite.say("Writing on RFID! Please put RFID tag.")
  
  # Try to write the RFID tag to the house
  if house.writetorfid(sprite.answer(), 2):
    # Set the MasterFlag to 1, indicating the RFID tag has been written
    MasterFlag = 1

    # Set the master tag of the RFID
    house.setmaster()

    # Let the user know the RFID tag is created
    sprite.say("RFID tag created", 2)

  # If the RFID tag couldn't be written
  else:
    # Ask the user to put the RFID tag again
    sprite.say("No tag detected, please put RFID tag", 2)

This is how it looks:

Code for RFID Authentication

This code makes the Quarky open the door when it reads a special RFID card:

  1. First, it imports the time library. Then, it creates a Quarky object and an IoTHouse object.
  2. The IoTHouse object is initialized with an RFID reader and it can read the RFID card.
  3. Then, it moves the servo of the door to 100 degrees to close the door.
  4. Inside the while loop, it checks if the RFID is read.
    1. If it is read then it checks if the data scanned is Quarky. If it is, it moves the servo to 0 and then draws a pattern on the Quarky Display. It then waits for two seconds and moves the servo back to 100. After that, it clears the display of the Quarky.
    2. If the scanned data is not Quarky, then it draws a different pattern on the Quarky object and waits for one second. After that, it clears the display of the Quarky.
# First, we import the time library
import time

# We also create a Quarky object
quarky = Quarky()

# We create an IoTHouse object called 'house'
house = IoTHouse()
# We initialise the RFID of the house object
house.initialiserfid()

# We move the servo of the Quarky object to 100
quarky.moveservo("Servo 1", 100)

# We create a while loop that will go on forever
while True:
  # Check if the RFID is read
  if house.readrfid(3):
    # Check if the scanned data is Quarky
    if (house.readscanneddata() == "Quarky"):
      # Move the servo to 0
      quarky.moveservo("Servo 1", 0)
      # Draw a pattern on the Quarky Display
      quarky.drawpattern("aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa")
      # Sleep for 2 seconds
      time.sleep(2)
      # Move the servo to 100
      quarky.moveservo("Servo 1", 100)
      # Clear the display of the Quarky Display
      quarky.cleardisplay()
    
    # If the scanned data is not Quarky
    else:
      # Draw a different pattern on the Quarky object
      quarky.drawpattern("bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb")
      # Sleep for 1 second
      time.sleep(1)
      # Clear the display of the Quarky object
      quarky.cleardisplay()

Output

Read More
Learn how to create custom sounds to control Humanoid with the Audio Classifier of the Machine Learning Environment in PictoBlox.

Introduction

A Sound-Based Humanoid with Machine Learning refers to a Humanoid robot that can perceive and interact with its environment through sound-based sensing and uses machine-learning techniques to process and analyze the auditory data it receives.

Humanoid robots with machine learning have the potential to greatly enhance the way we interact with machines and each other, making communication more natural and intuitive while also enabling new applications in fields such as healthcare, education, and entertainment.

In this activity, we will use the Machine Learning Environment of the PictoBlox software. We will use the Audio Classifier of the Machine Learning Environment and create our custom sounds to control the Humanoid.

Audio Classifier Workflow

Follow the steps below to create your own Audio Classifier Model:

  1. Open PictoBlox and create a new file.
  2. Select the Block coding environment as the appropriate Coding Environment.
  3. Select the “Open ML Environment” option under the “Files” tab to access the ML Environment.
  4. A new window will open. Type in an appropriate project name of your choice and select the “Audio Classifier” extension. Click the “Create Project” button to open the Audio Classifier Window.
  5. You shall see the Classifier workflow with two classes already made for you. Your environment is all set. Now it’s time to upload the data.
  6. As you can observe in the above image, we will add two classes for audio. We will be able to add audio samples with the help of the microphone. Rename class 1 as “Clap” and class 2 as “Snap”.

Note: You can add more classes to the projects using the Add Class button.

Adding Data to Class

You can perform the following operations to manipulate the data into a class.

  1. Naming the Class: You can rename the class by clicking on the edit button.
  2. Adding Data to the Class: You can add the data using the Microphone.
  3. You can add audio samples to each class; make sure you add at least 20 samples per class for the model to run with good accuracy.
  4. Add the first class as “Clap” and record the audio for clap noises through the microphone.
  5. Add the second class as “Snap” and record the audio for snap noises through the microphone.

Note: You can change a class name only at the start, before adding any audio samples. You will not be able to change the class name after audio samples have been added to the respective class.

Training the Model

After data is added, it’s fit to be used in model training. To do this, we have to train the model. By training the model, we extract meaningful information from the audio samples, and that in turn updates the weights. Once these weights are saved, we can use our model to make predictions on data previously unseen.

The accuracy of the model should increase over time. The x-axis of the graph shows the epochs, and the y-axis represents the accuracy at the corresponding epoch. Remember, the higher the reading in the accuracy graph, the better the model. The range of accuracy is 0 to 1.

Testing the Model

To test the model, simply use the microphone directly and check the classes as shown in the image below:

You will be able to test the difference in audio samples recorded from the microphone as shown below:

Export in Block Coding

Click on the “Export Model” button on the top right of the Testing box, and PictoBlox will load your model into the Block Coding Environment if you have opened the ML Environment in the Block Coding.

 

The Humanoid will move according to the following logic:

  1. When the audio is identified as “clap” – the Humanoid will move forward.
  2. When the “snap” sound is detected – the Humanoid will move backward.

Note: You can add even more classes with different types of differentiating sounds to customize your control. This is just a small example from which you can build your own Sound Based Controlled Humanoid in a very easy stepwise procedure.

Code

Logic

  1. First, initialize the Humanoid extension.
  2. Then, initialize a forever loop to continuously analyze the detected sound (see the sketch after this list).
  3. If the program detects a clap sound, the Humanoid will move forward at a specific speed.
  4. Similarly, if it identifies a snap sound, the Humanoid will move backward at a specific speed.
  5. Otherwise, the Humanoid will remain in its initial position (home position).
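The following is a minimal, self-contained Python sketch of this logic. Every helper below is a hypothetical stand-in for the exported audio-classifier and Humanoid blocks, not a PictoBlox API.

# Hypothetical stand-ins for the exported classifier and Humanoid blocks.
def recognised_class():
  return "Clap"  # stub: the real block returns the class detected from the microphone

def move_forward(speed):
  print("Humanoid moves forward at speed", speed)

def move_backward(speed):
  print("Humanoid moves backward at speed", speed)

def go_home():
  print("Humanoid returns to the home position")

# Stand-in for the forever loop in the script.
for _ in range(3):
  detected = recognised_class()
  if detected == "Clap":
    move_forward(50)
  elif detected == "Snap":
    move_backward(50)
  else:
    go_home()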

Output

Read More
This example shows how to use a text classifier in PictoBlox to make a Twitter sentiment analysis bot.

Introduction

In this example project, we are going to create a machine learning model that can classify the nature of a comment based on the typed comment input by the user.

Text Classifier in Machine Learning Environment

The Text Classifier is the extension of the ML Environment used for classifying the nature of a comment or message typed by the user.

Text Classifier Workflow

  1. Open PictoBlox and create a new file.
  2. You can click on “Machine Learning Environment” to open it.
  3. Click on “Create New Project“.
  4. A window will open. Type in a project name of your choice and select the “Text Classifier” extension. Click the “Create Project” button to open the Text Classifier window.
  5. You shall see the Text Classifier workflow with two classes already made for you. Your environment is all set. Now it’s time to upload the data.

Class in Text Classifier

Class is the category in which the Machine Learning model classifies the text. Similar texts are put in one class.

 

Adding Data to Class

You can perform the following operations to manipulate the data into a class.

  1. Naming the Class: You can rename the class by clicking on the edit button.
  2. Adding Data to the Class: You can add the data by writing the text or by uploading files from the local folder.
    1. Uploading Dataset:

Training the Model

After data is added, it’s fit to be used in model training. In order to do this, we have to train the model. By training the model, we extract meaningful information from the texts, and that in turn updates the weights. Once these weights are saved, we can use our model to make predictions on data previously unseen.

 

Testing the Model

To test the model, simply enter the input values in the “Testing” panel.

The model will return the probability of the input belonging to the classes.

Export in Block Coding

Click on the “Export Model” button on the top right of the Testing box, and PictoBlox will load your model into the Block Coding Environment if you have opened the ML Environment in the Block Coding.

Script

The idea is simple: the user writes text in the input panel, and Tobi tells the nature of the text.

  1. We’ll start by adding a when flag clicked block from the Events palette.
  2. Add a “forever” block from the Control palette.
  3. Add an “ask () and wait” block from the Sensing palette. Write an appropriate statement in the empty space.
  4. Add a “say ()” block from the Looks palette.
  5. Inside the say block, add the join () () block from the Operator palette.
  6. Inside the join block, write a statement in the first empty space; in the second empty space, add the get class of () block from the Machine Learning palette, and in its empty space select an “answer” block from the Sensing palette.
  7. Add a “wait () seconds” block from the Control palette.
  8. The final code of the “Tobi” sprite is:
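For comparison, here is a minimal Python sketch of the same script, using the sprite API that appears in other examples in this document; get_class() is a hypothetical stand-in for the exported “get class of ()” block.

sprite = Sprite('Tobi')

def get_class(text):
  return "Positive"  # stub: the real block returns the predicted class of the text

while True:
  sprite.input("Write a comment and I will tell you its nature.")
  comment = sprite.answer()
  sprite.say("The comment is " + get_class(comment), 2)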

Final Output

 

Read More
In this example, we are going to learn how to program the Quarky to detect the ambient light reading and make the light turn ON and OFF accordingly.


LDR Sensor Connection to Quarky

LDR sensors have 4 pins: GND, VCC, DO, and AO. You have to connect the following 3 pins to the Quarky Expansion Board:

  1. GND to Ground Pin of Quarky Expansion Board
  2. VCC to 3.3V or VCC Pin of Quarky Expansion Board
  3. AO to the A1 (Analog Pin) of the Quarky Expansion Board

Getting Sensor Reading

The following script displays the real-time sensor reading. The reading will vary from 0 to 4095.

Connect Quarky and you will start getting the readings.

Download PictoBlox Code: https://pictoblox.page.link/8Brek1NuDSeLBSsH9

You will notice from the readings that when you put your hands close to the sensor, the value is higher. This is because the sensor receives less light and the voltage across the sensor increases. We will create a threshold to identify when the light is low; for the above readings, it will be 2800. Now, if the sensor value is greater than 2800, the sensor is active, meaning that the light is low. We will use this in the next code (see the sketch below).
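A minimal sketch of that threshold check, assuming a reading like the one displayed by the script above:

THRESHOLD = 2800  # chosen from the readings above

def is_light_low(reading):
  # Readings above the threshold mean the LDR receives less light.
  return reading > THRESHOLD

print(is_light_low(3100))  # True -> turn the lights ON
print(is_light_low(1200))  # False -> turn the lights OFF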

Automatic Light

The following code turns the Quarky lights ON when the light is low; otherwise, the lights stay OFF.

Download PictoBlox Code: https://pictoblox.page.link/gS4skqbu9p7LYytYA

 

Uploading Code

You can also make the automatic lighting work independently of PictoBlox using the Upload Mode. For that, switch to Upload Mode and replace the when green flag clicked block with the when Quarky starts up block.

Click on the Upload Code button.

Alternative Code for Stage Mode

You can also use the when () at () active block to make the same program. This hat block activates when the LDR is active. You must set the threshold for the code to work.

Read More
Learn how to program Quarky to detect the ambient light reading and make the light turn ON and OFF accordingly.

In this example, we are going to learn how to program the Quarky to detect the ambient light reading and make the light turn ON and OFF accordingly.

LDR Sensor Connection to Quarky

LDR sensors have 4 pins: GND, VCC, DO, and AO. You have to connect the following 3 pins to the Quarky Expansion Board:

  1. GND to Ground Pin of Quarky Expansion Board
  2. VCC to 3.3V or VCC Pin of Quarky Expansion Board
  3. AO to the A1 (Analog Pin) of the Quarky Expansion Board

Getting Sensor Reading

The following script displays the real-time sensor reading. The reading will vary from 0 to 4095.

# Create a Sprite object named 'Tobi'
sprite = Sprite('Tobi')

# Create a Quarky object
quarky = Quarky()

# Create an IoTHouse object
house = IoTHouse()

# Create an infinite loop
while True:
  # Have the Sprite say the LDR value of 'A1' in the IoTHouse
  sprite.say("Sensor Reading - " + str(house.ldrvalue("A1")))

 

Connect Quarky and you will start getting the readings.

You will notice from the readings that when you put your hands close to the sensor, the value is higher. This is because the sensor receives less light and the voltage across the sensor increases. We will create a threshold to identify when the light is low; for the above readings, it will be 2800. Now, if the sensor value is greater than 2800, the sensor is active, meaning that the light is low. We will use this in the next code.

Automatic Light

  1. In this code, we create two objects: one called quarky and one called house.
  2. We set the brightness of quarky to 15 and the light threshold of the house to 2800.
  3. We have an infinite loop that checks the light status at sensor A1.
  4. If the light status at sensor A1 is true, it lights up the Quarky display with white light. If the light status at sensor A1 is false, it clears the Quarky display.

 

Uploading Code

 

Click on the Upload Code button.

# This code uses two different libraries: 'quarky' and 'iothouse'. 
# 'iothouse' is used to detect the light intensity in a room, and 'quarky' is used to control LEDs.

# First, we set the brightness of the LEDs to 15.
from quarky import *
quarky.setbrightness(15)

# Then, we set the light level threshold to 2800.
import iothouse
house = iothouse.iothouse()
house.setldrthreshold(2800)

# Now, the program will check the light level continuously and take action based on the result.
while True:
  # If the light is low (reading above the threshold), the program will draw a pattern on the LEDs.
  if house.ldrstatus("A1"):
    quarky.drawpattern("aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa")

  # If the light is not low (reading below the threshold), the program will clear the LEDs.
  else:
    quarky.cleardisplay()
Read More
This example shows you how to use the flame sensor to detect heat or flame nearby, and then how to create an alarm system that is triggered by the flame sensor.

This example demonstrates how to set up the flame sensor with Quarky to detect heat or flame nearby. Later, we create an alarm system triggered with the flame sensor.

Flame Sensor Connection to Quarky

Flame sensors have 4 pins: GND, VCC, DO, and AO. You have to connect the following 3 pins to the Quarky Expansion Board:

  1. GND to Ground Pin of Quarky Expansion Board
  2. VCC to 3.3V or VCC Pin of Quarky Expansion Board
  3. DO to the D3 (Digital Pin) of the Quarky Expansion Board

Calibrating Flame Sensor

The sensor also has 2 LEDs: Power and Detection. The sensor is correctly calibrated when it stays inactive with no heat or flame nearby and becomes active when a flame is nearby; this is visible on the detection LED.

To calibrate the flame sensor:

  1. Turn the power on for the sensor.
  2. Place the sensor close to the heat or flame. You should see the detection LED turn on.
  3. If the LED is Off, adjust the potentiometer until the detection LED turns on.
  4. Move the sensor away from the heat or flame. The detection LED should turn off.
  5. If the detection LED does not turn off, continue to adjust the potentiometer until it does.

Project: Flame-based Alarm System

In the project, when heat or flame is detected, the alarm system starts with:

  1. The fan turned ON.
  2. Quarky beeping with lights.
  3. The door opened for urgent evacuation.

The alarm system will be on until the flame sensor stops detecting the fire.

Circuit

Connect the following modules to the Quarky Expansion Board:

  1. Flame Sensor
    1. GND to Ground Pin of Quarky Expansion Board
    2. VCC to 3.3V or VCC Pin of Quarky Expansion Board
    3. DO to the D3 (Digital Pin) of the Quarky Expansion Board
  2. Motor Fan: Connect the motor to the Motor Port 1 of the Quarky Expansion Board.
  3. Door Servo Motor: Connect the servo motor to the servo port 5 of the Quarky Expansion Board.

Code

Adding IoT in Fire Alarm System

As an advanced system, we can also send the fire detection alert to the users using IFTTT. For that, we will use IFTTT webhooks.

The following IFTTT sequence is to be created:

You can learn in detail how to create an IFTTT applet here: https://ai.thestempedia.com/extension/ifttt-webhooks/

Code

You can download the code from here: Flame-Based Alarm System – Stage Mode
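For reference, here is a minimal sketch of triggering the webhook, using the IFTTTWebhooks helper that appears in the Python version of this project later in this document; the key below is a placeholder.

ifttt = IFTTTWebhooks()
ifttt.setifttt("Flame_Detected", "YOUR_WEBHOOK_KEY")    # event name and placeholder key
ifttt.setvalues("Fire Started! Evacuation Started", 1)  # message and priority
ifttt.triggerevent()                                    # send the event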

IoT-based Fire Alarm in Upload Mode

You can also make the fire alarm system work independently of PictoBlox using the Upload Mode. For that, switch to Upload Mode and replace the when green flag clicked block with the when Quarky starts up block.

Note: Make sure you know how to work on Upload Mode with Quarky in IoT. Follow the example to learn more: https://ai.thestempedia.com/example/automatic-light-control-with-ldr-sensor/

You can download the code from here: Flame-Based Alarm System – Upload Mode

Read More
Learn how to set up a flame sensor with Quarky, calibrate it, and create a fire alarm system with a motor fan, door servo motor, and IFTTT webhooks. Get step-by-step instructions, circuit diagrams, python code, and more.

This example demonstrates how to set up the flame sensor with Quarky to detect heat or flame nearby. Later, we create an alarm system triggered with the flame sensor.

Flame Sensor Connection to Quarky

Flame sensors have 4 pins: GND, VCC, DO, and AO. You have to connect the following 3 pins to the Quarky Expansion Board:

  1. GND to Ground Pin of Quarky Expansion Board
  2. VCC to 3.3V or VCC Pin of Quarky Expansion Board
  3. DO to the D3 (Digital Pin) of the Quarky Expansion Board

Calibrating Flame Sensor

The sensor also has 2 LEDs: Power and Detection. The sensor is correctly calibrated when it stays inactive with no heat or flame nearby and becomes active when a flame is nearby; this is visible on the detection LED.

To calibrate the flame sensor:

  1. Turn the power on for the sensor.
  2. Place the sensor close to the heat or flame. You should see the detection LED turn on.
  3. If the LED is Off, adjust the potentiometer until the detection LED turns on.
  4. Move the sensor away from the heat or flame. The detection LED should turn off.
  5. If the detection LED does not turn off, continue to adjust the potentiometer until it does.

Project: Flame-based Alarm System

In the project, when heat or flame is detected, the alarm system starts with:

  1. The fan turned ON.
  2. Quarky beeping with lights.
  3. The door opened for urgent evacuation.

The alarm system will be on until the flame sensor stops detecting the fire.

Circuit

Connect the following modules to the Quarky Expansion Board:

  1. Flame Sensor
    1. GND to Ground Pin of Quarky Expansion Board
    2. VCC to 3.3V or VCC Pin of Quarky Expansion Board
    3. DO to the D3 (Digital Pin) of the Quarky Expansion Board
  2. Motor Fan: Connect the motor to the Motor Port 1 of the Quarky Expansion Board.
  3. Door Servo Motor: Connect the servo motor to the servo port 5 of the Quarky Expansion Board.

Python Code

  1. The code first creates objects of the Quarky, Expansion, and IoTHouse classes.
  2. It then moves the servo connected to pin 5 to 100 degrees and stops the motor connected to pin 1.
  3. Then it defines two functions, fireDetectedSequence() and fireStopSequence().
    1. When a fire is detected, it moves the servo connected to pin 5 to 0 degrees, runs the motor connected to pin 1 in the clockwise direction at speed 100, plays a tone at C4 pitch for 8 beats, and draws a red pattern on the Quarky display.
    2. When the fire is no longer detected, it stops the motor connected to pin 1, clears the display of the Quarky robot, and moves the servo connected to pin 5 to 100 degrees.
  4. In the while loop, the code keeps checking the flame status of pin D3 in the house.
    1. If a flame is detected, it waits for two seconds and checks the flame status again. If the flame is still detected, it runs fireDetectedSequence() and then fireStopSequence().
# The following code is written to detect a fire in a house using a Quarky robot and Expansion board.
# quarky is an instance of the Quarky class which has functionalities like playing a tone and drawing a pattern on LED Screen
quarky = Quarky()
# quarkyexpansion is an instance of the Expansion class which has functionalities like moving a servo and running a motor
quarkyexpansion = Expansion()
# house is an instance of the IoTHouse class which has functionalities like checking the flame status
house = IoTHouse()
# import time library which has functionalities like sleeping for a certain amount of time
import time

# move the servo connected to pin 5 to 100 degrees
quarkyexpansion.moveservo(5, 100)
# stop the motor connected to pin 1
quarkyexpansion.stopmotor(1)


# define a function which initiates the alarm sequence when a flame is detected
def fireDetectedSequence():
  # move the servo connected to pin 5 to 0 degrees
  quarkyexpansion.moveservo(5, 0)
  # run the motor connected to pin 1 in clockwise direction with speed 100
  quarkyexpansion.runmotor(1, 1, 100)
  # keep the alarm running until the flame is no longer detected
  while not (house.flamestatus("D3")):
    # clear the display of the Quarky robot
    quarky.cleardisplay()
    # play a tone at C4 pitch for 8 beats
    quarky.playtone("C4", 8)
    # draw red pattern on the Quarky display
    quarky.drawpattern("bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb")
    time.sleep(0.7)


# define a function which is initiated when no flame is detected
def fireStopSequence():
  # move the servo connected to pin 5 to 100 degrees
  quarkyexpansion.moveservo(5, 100)
  # stop the motor connected to pin 1
  quarkyexpansion.stopmotor(1)
  # clear the display of the Quarky robot
  quarky.cleardisplay()


while True:
  # keep on checking the flame status of pin D3 in the house
  if not (house.flamestatus("D3")):
    time.sleep(2)
    # again check the flame status of pin D3 in the house
    if not (house.flamestatus("D3")):
      # if flame is detected, run the fireDetectedSequence()
      fireDetectedSequence()
      # and then run the fireStopSequence()
      fireStopSequence()

Adding IoT in Fire Alarm System

As an advanced system, we can also send the fire detection alert to the users using IFTTT. For that, we will use IFTTT webhooks.

The following IFTTT sequence is to be created:

You can learn in detail how to create an IFTTT applet here: https://ai.thestempedia.com/extension/ifttt-webhooks/

Code

This code is a continuation of the previous code:

  1. The code will continuously check the flame status of pin D3 in the house.
  2. If the flame is detected, it will initiate the fireDetectedSequence() which will send an event to IFTTT Webhooks, move the servo connected to pin 5 to 0 degrees, run the motor connected to pin 1 in the clockwise direction with speed 100, clear the display of the Quarky robot, and play a tone at C4 pitch for 8 beats.
  3. Once the flame status is no longer detected, it will initiate the fireStopSequence() which will send an event to IFTTT Webhooks, move the servo connected to pin 5 to 100 degrees, stop the motor connected to pin 1, and clear the display of the Quarky robot.
# The following code is written to detect a fire in a house using a Quarky robot and Expansion board.
# quarky is an instance of the Quarky class which has functionalities like playing a tone and drawing a pattern on LED Screen
quarky = Quarky()
# quarkyexpansion is an instance of the Expansion class which has functionalities like moving a servo and running a motor
quarkyexpansion = Expansion()
# house is an instance of the IoTHouse class which has functionalities like checking the flame status
house = IoTHouse()
# import time library which has functionalities like sleeping for a certain amount of time
import time
#Create an instance of the IFTTTWebhooks library
ifttt = IFTTTWebhooks()

# move the servo connected to pin 5 to 100 degrees
quarkyexpansion.moveservo(5, 100)
# stop the motor connected to pin 1
quarkyexpansion.stopmotor(1)

#Set the webhook key and event name
ifttt.setifttt("Flame_Detected", "iNyFg77wDLYV-V9UtdXVtmeebiOw_72LjxZud084ybr")

# define a function which initiates the alarm sequence when a flame is detected
def fireDetectedSequence():
  #Set the message and priority
  ifttt.setvalues("Fire Started! Evacuation Started", 1)
  ifttt.triggerevent() #Send the event
  # move the servo connected to pin 5 to 0 degrees
  quarkyexpansion.moveservo(5, 0)
  # run the motor connected to pin 1 in clockwise direction with speed 100
  quarkyexpansion.runmotor(1, 1, 100)
  # keep the alarm running until the flame is no longer detected
  while not (house.flamestatus("D3")):
    # clear the display of the Quarky robot
    quarky.cleardisplay()
    # play a tone at C4 pitch for 8 beats
    quarky.playtone("C4", 8)
    # draw red pattern on the Quarky display
    quarky.drawpattern("bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb")
    time.sleep(0.7)


# define a function which is initiated when no flame is detected
def fireStopSequence():
  #Set the message and priority
  ifttt.setvalues("Fire Stopped", 1)
  ifttt.triggerevent() #Send the event
  # move the servo connected to pin 5 to 100 degrees
  quarkyexpansion.moveservo(5, 100)
  # stop the motor connected to pin 1
  quarkyexpansion.stopmotor(1)
  # clear the display of the Quarky robot
  quarky.cleardisplay()


while True:
  # keep on checking the flame status of pin D3 in the house
  if not (house.flamestatus("D3")):
    time.sleep(2)
    # again check the flame status of pin D3 in the house
    if not (house.flamestatus("D3")):
      # if flame is detected, run the fireDetectedSequence()
      fireDetectedSequence()
      # and then run the fireStopSequence()
      fireStopSequence()
Read More
This example shows how to use a number classifier in PictoBlox to make a customer spending classifier bot.

Introduction

In this example project, we are going to create a Machine Learning Model that can predict the amount of money a customer will spend based on the input details added by the user.

Numbers(C/R) in Machine Learning Environment

Datasets on the internet are hardly ever fit to directly train on. Programmers often have to take care of unnecessary columns, text data, target columns, correlations, etc. Thankfully, PictoBlox’s ML Environment is packed with features to help us pre-process the data as per our liking.

Let’s create the ML model.

Opening Numbers(C/R) Workflow

Alert: The Machine Learning Environment for model creation is available only in the desktop version of PictoBlox for Windows, macOS, or Linux. It is not available in the Web, Android, and iOS versions.

Follow the steps below:

  1. Open PictoBlox and create a new file.
  2. Select the coding environment as Block Coding Environment.
  3. Select the “Open ML Environment” option under the “Files” tab to access the ML Environment.
  4. You’ll be greeted with the following screen. Click on “Create New Project“.
  5. A window will open. Type in a project name of your choice and select the “Numbers(C/R)” extension. Click the “Create Project” button to open the Numbers(C/R) window.
  6. You shall see the Numbers C/R workflow with an option to either “Upload Dataset” or “Create Dataset”.

    Uploading/Creating Dataset

    Datasets can either be uploaded or created on the ML Environment. Let’s see how it is done.

    Uploading a dataset
    1. To upload a dataset, click on the Upload Dataset button and then the Choose CSV from your files button.
      Note: An uploaded dataset must be a “.csv” file.
    2. Once uploaded, the first 50 rows of the uploaded CSV document will show up in the window.
    Creating a Dataset
    1. To create a dataset, click on the Create Dataset button.
    2. Select the number of rows and columns that are to be added and click on the Create button. More rows and columns can be added as and when needed.

    Notes:

    1. Each column represents a feature. These are the values used by the model to train itself.
    2. The “Output” column contains the target values. These are the values that we expect the model to return when features are passed.
    3. The window only shows the first 50 rows of the dataset.
    4. Un-check the “Select All” checkbox to un-select all the columns.

    Training the Model

    After data is pre-processed and optimized, it’s fit to be used in model training. To train the model, simply click the “Train Model” button found in the “Training” panel.

    By training the model, meaningful information is extracted from the numbers, and that in turn updates the weights. Once these weights are saved, the model can be used to make predictions on data previously unseen.

    The model’s function is to use the input data and predict the output. The target column must always contain numbers.

    However, before training the model, there are a few hyperparameters that need to be understood. Click on the “Advanced” tab to view them.

    There are three hyperparameters that can be altered in the Numbers(C/R) Extension:

    1. Epochs– The total number of times the data will be fed through the training model. Therefore, in 10 epochs, the dataset will be fed through the training model 10 times. Increasing the number of epochs can often lead to better performance.
    2. Batch Size– The size of the set of samples that will be used in one step. For example, if there are 160 data samples in the dataset, and the batch size is set to 16, each epoch will be completed in 160/16=10 steps. This hyperparameter rarely needs any altering.
    3. Learning Rate– It dictates the speed at which the model updates the weights after iterating through a step. Even small changes in this parameter can have a huge impact on the model performance. The usual range lies between 0.001 and 0.0001.
    Note: Hover the mouse pointer over the question mark next to the hyperparameters to see their description.

    It’s a good idea to train a numeric classification model for a high number of epochs. The model can be trained in both JavaScript and Python. In order to choose between the two, click on the switch on top of the Training panel.

    Alert: Dependencies must be downloaded to train the model in Python; JavaScript is chosen by default.

    The accuracy of the model should increase over time. The x-axis of the graph shows the epochs, and the y-axis represents the accuracy at the corresponding epoch.


    Testing the Model

    To test the model, simply enter the input values in the “Testing” panel and click on the “Predict” button.

    The model will return the probability of the input belonging to the classes.

Export in Block Coding

Click on the “Export Model” button on the top right of the Testing box, and PictoBlox will load your model into the Block Coding Environment if you have opened the ML Environment in the Block Coding.

Script

  1. Select the “Tobi” sprite.
  2. We’ll start by adding a when flag clicked block from the Events palette.
  3. Add an “ask () and wait” block from the Sensing palette. Write an appropriate statement in the empty space.
  4. Add the “if () then” block from the Control palette for checking the user’s input.
  5. In the empty space of the “if () then” block, add a “()=()” block from the Operator palette. In the first empty space, select an “answer” block from the Sensing palette; in the second, write an appropriate statement.
  6. Add the “set gender as ()” block from the Machine Learning palette. Select the Male option in the empty space.
  7. Repeat the “if () then” block code for the other variables, make the appropriate changes in the copied block code, and add it just below.
  8. Add an “ask () and wait” block from the Sensing palette. Write an appropriate statement in the empty space.
  9. Add the “if () then” block from the Control palette for checking the user’s input.
  10. In the empty space of the “if () then” block, add a “()=()” block from the Operator palette. In the first empty space, select an “answer” block from the Sensing palette; in the second, write an appropriate statement.
  11. Add the “set education as ()” block from the Machine Learning palette. Select the High School option in the empty space.
  12. Repeat the “if () then” block code for the other variables, make the appropriate changes in the copied block code, and add it below.
  13. Add an “ask () and wait” block from the Sensing palette. Write an appropriate statement in the empty space.
  14. Add the “set () as ()” block from the Machine Learning palette. Select the age option in the first empty space, and for the second, select an “answer” block from the Sensing palette.
  15. Repeat the “ask () and wait” block code for the other variables, making the appropriate changes according to each variable.
  16. Repeat the “ask () and wait” block code for the remaining variables in the same way.
  17. Add the “say ()” block from the Looks palette.
  18. Add a “join () ()” block from the Operator palette. Write an appropriate statement in the first empty space, and in the second empty space add the “analyse numbers” block from the Machine Learning palette.

    Final Result

Read More
This project demonstrates how to create a voice-controlled smart plug using natural language processing (NLP), speech recognition, and a relay.


Text Classifier in PictoBlox

We will use the PictoBlox Machine Learning environment for creating the text classifier.

Follow the steps to create the model:

  1. Open PictoBlox and the Machine Learning Environment.
  2. Click on Open Project and import the following project file: Alexa
  3. Find the Alexa project in the list and open it to get the workspace of the text classifier.
  4. There will be 2 classes, Lights On and Lights Off. Each class has some phrases added corresponding to the action. You can add a few phrases yourself.
  5. Click on Train Model to train the model.
  6. Once trained, you can check the model in the Testing tab. Add a phrase and check if it provides the right sentiment.
    Note: If the model is not giving the desired results, then add more phrases until the model is good.
  7. Click on the Export Model to create the blocks for the project.
  8. Test the block with some input text.

Circuit

The bulb is connected to the smart plug which is controlled with a relay.

Note:  A relay is an electromechanical switch that is used to turn on or turn off a circuit by using a small amount of power to operate an electromagnet to open or close a switch.

If the relay is ON, the smart switch gets ON, turning on the light. The relay has the following connections:

  1. GND Pin connected to GND of the Quarky Expansion Board.
  2. VCC Pin connected to VCC of the Quarky Expansion Board.
  3. Signal Pin connected to Servo 4 of the Quarky Expansion Board.

Code

The logic is the following: the Speech Recognition extension converts speech to text, which is then fed into the text classifier block. The text classifier block provides the sentiment, which is used to control the light via the relay. A minimal sketch of this pipeline follows.
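Every helper in the sketch below is a hypothetical stand-in for the Speech Recognition extension, the exported text classifier block, and the relay signal on Servo 4; it only illustrates the data flow.

def recognise_speech():
  return "turn on the light"  # stub for the speech-to-text block

def get_class(text):
  # stub classifier standing in for the exported Lights On / Lights Off model
  return "Lights On" if "on" in text else "Lights Off"

def set_relay(on):
  print("Relay", "ON" if on else "OFF")  # stub for the signal pin on Servo 4

command = recognise_speech()
set_relay(get_class(command) == "Lights On")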

Output

Read More
Learn how to create a dataset and Machine Learning Model for an automated game from the user's input. See how to open the ML environment, upload data, label images, train the model, and export the Python script.

Introduction

In this example project, we are going to create a Machine Learning Model where fish automatically feed on randomly generated food.

Data Collection

  • Now, we are going to collect the data for the automated fish feast game.
  • This data will contain the actions that we have taken to accomplish the game successfully.
  • We will use the data we collect here to teach our device how to play the automated fish feast game, i.e. to perform machine learning.
  • The data that you collect will be saved on your device as a CSV (comma-separated values) file. If you open this file in Microsoft Excel, it will look as shown below (an illustrative sketch of a few rows follows this list):
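The column layout follows the DataFrame created in the code below; the rows here are only illustrative, since the actual values depend on how you play:

curr_X, curr_Y, tar_x, tar_y, diff_x, diff_y, direction, Action
4,      3,      40,    -80,   -36,    83,     90,        RIGHT
9,      3,      40,    -80,   -31,    83,     100,       UP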

Follow the steps below:

  1. Open PictoBlox and create a new file.
  2. Select the coding environment as Python Coding Environment.
  3. Now write the code in Python.

Code for making dataset

  1. Create a sprite object named “Fish”. A sprite is typically a graphical element that can be animated or displayed on a screen.
  2. Create another sprite object named “Orange” and upload the “Underwater2” backdrop.
  3. Click on the Fish.py file from the Project files section.
    sprite = Sprite('Fish')
  4. Similarly, declare new sprite on the Fish.py file.
    sprite1 = Sprite('Orange')
  5. Then we will import the time, random, os, math, TensorFlow (as tf), and Pandas (as pd) modules using the import keyword.
    1. Time – For adding delays in the program.
    2. Random – For generating random positions.
    3. Pandas as pd – For using a DataFrame.
    4. Math – For using math functions in the code.
    5. Os – For reading files from the project files.
    6. TensorFlow as tf – For the machine learning model used later.
      import random
      import time
      import tensorflow as tf
      import pandas as pd
      import os
      import math
  6. Now, make six variables, curr_x, curr_y, ang_f, mov_f, score, and angle, with initial values 4, 3, 10, 5, 0, and 90 respectively.
    1. curr_x – To store the initial x – position of fish.
    2. curr_y – To store the initial y – position of fish.
    3. ang_f – To store increment value in angle of fish on pressing specific key.
    4. mov_f – To store increment value in movement of fish on pressing specific key.
    5. angle – To store initial angle of fish.
    6. score – To store the score while playing the game.
      curr_x = 4
      curr_y = 3
      ang_f= 10
      mov_f= 5
      score = 0
      angle = 90
  7. Now set the initial position and angle of the fish.
    sprite.setx(curr_x)
    sprite.sety(curr_y)
    sprite.setdirection(DIRECTION=90)
  8. Now, make a function settarget() that generates food at a random position. We pass one argument, “t”, to the function to generate the target food on a grid with spacing t.
    1. x and y – To generate the food at a random position on the stage.
    2. time.sleep – For adding a time delay.
    3. sprite1.setx()/sety() – To set the food’s position on the stage.
      def settarget(t):
        x = random.randrange(-200, 200, t)
        y = random.randrange(-155, 155, t)
        time.sleep(0.1)
        sprite1.setx(x)
        sprite1.sety(y)
        return x, y
  9. Now set the target (food). The fish chases the food, and target_x and target_y are set to the x and y positions of the food.
    target_x, target_y = settarget(40) 
  10. Now create a DataFrame backed by the file “Chase_Data.csv” to collect the data for machine learning; if a CSV with this name already exists, load it and add the new data to it.
    if(os.path.isfile('Chase_Data.csv')):
      data=pd.read_csv('Chase_Data.csv')
    else:
      data = pd.DataFrame({"curr_X": curr_x, "curr_Y": curr_y, "tar_x": target_x, "tar_y": target_y, "diff_x":curr_x-target_x, "diff_y":curr_y-target_y, "direction": angle, "Action": "RIGHT"}, index=[0])
    
  11. After that, we will use the while True loop to run the code indefinitely. Don’t forget to add a colon ‘:’ just after the loop to avoid errors.
    while True:
  12. Now write the script for moving the fish in the forward direction and changing its direction clockwise or anticlockwise by a fixed value with the help of conditional statements.
    1. If the up arrow key is pressed, the fish will move mov_f steps in the same direction.
    2. After pressing the up arrow key, the action taken is stored in the DataFrame with the data.append command.
      if sprite.iskeypressed("up arrow"):
          data = data.append({"curr_X": curr_x, "curr_Y": curr_y, "tar_x": target_x, "tar_y": target_y, "diff_x":curr_x-target_x, "diff_y":curr_y-target_y, "direction": angle, "Action": "UP"}, ignore_index=True)
          sprite.move(mov_f)
  13. Repeat the process for setting the direction clockwise or anticlockwise.
    if sprite.iskeypressed("left arrow"):
        data = data.append({"curr_X": curr_x, "curr_Y": curr_y, "tar_x": target_x, "tar_y": target_y, "diff_x":curr_x-target_x, "diff_y":curr_y-target_y, "direction": angle, "Action": "LEFT"}, ignore_index=True)
        angle = angle - ang_f
        sprite.setdirection(DIRECTION=angle)
     if sprite.iskeypressed("right arrow"):
        data = data.append({"curr_X": curr_x, "curr_Y": curr_y, "tar_x": target_x, "tar_y": target_y, "diff_x":curr_x-target_x, "diff_y":curr_y-target_y, "direction": angle, "Action": "RIGHT"}, ignore_index=True)
        angle = angle + ang_f
        sprite.setdirection(DIRECTION=angle)
  14. Write a conditional statement for saving the data to the CSV file every couple of points.
    if(score>0 and score%2==0):
        data.to_csv('Chase_Data.csv',index=False)
  15. Again, write a conditional statement for the score variable: if the difference between the fish and food positions is less than 20, the score should be increased by one.
    if abs(curr_x-target_x)<20 and abs(curr_y-target_y)<20: 
        score = score + 1 
        sprite.say(("your score is: {}".format(score)))
  16. If the score is equal to or greater than 40, the data should be written to the Chase_Data.csv file and the loop should stop; otherwise, a new target is generated.
    if (score >= 40):
      data.to_csv('Chase_Data.csv', index=False)
      break
    target_x, target_y = settarget(40)
  17. Now update the curr_x and curr_y variables by storing the current position of the fish, and delay each loop iteration by 0.02 seconds.
    curr_x=math.floor(sprite.x())
    curr_y=math.floor(sprite.y())
    time.sleep(0.02)
  18. The final code is as follows:
    sprite = Sprite('Fish')
    sprite1 = Sprite('Orange')
     
    import random
    import time
    import tensorflow as tf
    import pandas as pd
    import os
    import math
    		
    curr_x = -170
    curr_y = 138
    score=0
    ang_f=10
    mov_f=5
    angle=90
    sprite.say(("your score is: {}".format(score)))
    sprite.setx(curr_x)
    sprite.sety(curr_y)
    sprite.setdirection(DIRECTION=90)
    
    def settarget(t):
      x = random.randrange(-200, 200, t)
      y = random.randrange(-155, 155, t)
      time.sleep(0.1)
      sprite1.setx(x)
      sprite1.sety(y)
      return x, y
    
    target_x, target_y = settarget(40)
    
    if(os.path.isfile('Chase_Data.csv')):
      data=pd.read_csv('Chase_Data.csv')
    else:
      data = pd.DataFrame({"curr_X": curr_x, "curr_Y": curr_y, "tar_x": target_x, "tar_y": target_y, "diff_x":curr_x-target_x, "diff_y":curr_y-target_y, "direction": angle, "Action": "RIGHT"}, index=[0])
    
    while True:
      angle=sprite.direction()
      if sprite.iskeypressed("up arrow"):
        data = data.append({"curr_X": curr_x, "curr_Y": curr_y, "tar_x": target_x, "tar_y": target_y, "diff_x":curr_x-target_x, "diff_y":curr_y-target_y, "direction": angle, "Action": "UP"}, ignore_index=True)
        sprite.move(mov_f)
    
         
      if sprite.iskeypressed("left arrow"):
        data = data.append({"curr_X": curr_x, "curr_Y": curr_y, "tar_x": target_x, "tar_y": target_y, "diff_x":curr_x-target_x, "diff_y":curr_y-target_y, "direction": angle, "Action": "LEFT"}, ignore_index=True)
        angle = angle - ang_f
        sprite.setdirection(DIRECTION=angle)
        
         
      if sprite.iskeypressed("right arrow"):
        data = data.append({"curr_X": curr_x, "curr_Y": curr_y, "tar_x": target_x, "tar_y": target_y, "diff_x":curr_x-target_x, "diff_y":curr_y-target_y, "direction": angle, "Action": "RIGHT"}, ignore_index=True)
        angle = angle + ang_f
        sprite.setdirection(DIRECTION=angle)
        
      if(score>0 and score%2==0):
        data.to_csv('Chase_Data.csv',index=False)
      
      if abs(curr_x-target_x)<20 and abs(curr_y-target_y)<20:
        score = score + 1
        sprite.say(("your score is: {}".format(score)))
        if (score >= 40):
          data.to_csv('Chase_Data.csv',index=False)
          break
        target_x, target_y = settarget(40)
      curr_x=math.floor(sprite.x())
      curr_y=math.floor(sprite.y())
      time.sleep(0.02)
     
    
  19. Press the Run button and play the fish feast game to collect data.
  20. Store this dataset on your local computer.

    Numbers(C/R) in Machine Learning Environment

    Datasets on the internet are hardly ever fit to directly train on. Programmers often have to take care of unnecessary columns, text data, target columns, correlations, etc. Thankfully, PictoBlox’s ML Environment is packed with features to help us pre-process the data as per our liking.

    Let’s create the ML model.

    Opening Numbers(C/R) Workflow

    Alert: The Machine Learning Environment for model creation is available only in the desktop version of PictoBlox for Windows, macOS, or Linux. It is not available in the Web, Android, and iOS versions.

    Follow the steps below:

    1. Open PictoBlox and create a new file.
    2. Select the coding environment as Block Coding Environment.
    3. Select the “Open ML Environment” option under the “Files” tab to access the ML Environment.
    4. You’ll be greeted with the following screen. Click on “Create New Project“.
    5. A window will open. Type in a project name of your choice and select the “Numbers(C/R)” extension. Click the “Create Project” button to open the Numbers(C/R) window.
    6. You shall see the Numbers C/R workflow with an option to either “Upload Dataset” or “Create Dataset”.

      Uploading/Creating Dataset

      Datasets can either be uploaded or created on the ML Environment. Let’s see how it is done.

      Uploading a dataset
      1. To upload a dataset, click on the Upload Dataset button and then the Choose CSV from your files button.
        Note: An uploaded dataset must be a “.csv” file.
      2. Once uploaded, the first 50 rows of the uploaded CSV document will show up in the window.

      3. If you look at the Output column, all the values are currently “0”. Hence, we first need to create an output column.
        1. In the dataset table, click on the tick near Select All to de-select all the columns.
        2. Click on the tick of the Action column to select it. We will make this column the output.
        3. The output column must always be numerical, so click on the Text to Number button to convert the data within this column to a numerical type.
        4. Now select it again and press the Set as Output button to set this column as the output.
        5. There are also columns that are not useful for training our model and need to be disabled. Select them and click the Disable button in the Selected columns section.

          Creating a Dataset
          1. To create a dataset, click on the Create Dataset button.
          2. Select the number of rows and columns that are to be added and click on the Create button. More rows and columns can be added as and when needed.

          Notes:

          1. Each column represents a feature. These are the values used by the model to train itself.
          2. The “Output” column contains the target values. These are the values that we expect the model to return when features are passed.
          3. The window only shows the first 50 rows of the dataset.
          4. Un-check the “Select All” checkbox to un-select all the columns.

          Training the Model

          After data is pre-processed and optimized, it’s fit to be used in model training. To train the model, simply click the “Train Model” button found in the “Training” panel.

          By training the model, meaningful information is extracted from the numbers, and that in turn updates the weights. Once these weights are saved, the model can be used to make predictions on data previously unseen.

          The model’s function is to use the input data and predict the output. The target column must always contain numbers.

          However, before training the model, there are a few hyperparameters that need to be understood. Click on the “Advanced” tab to view them.

          There are three hyperparameters that can be altered in the Numbers(C/R) Extension:

          1. Epochs– The total number of times the data will be fed through the training model. Therefore, in 10 epochs, the dataset will be fed through the training model 10 times. Increasing the number of epochs can often lead to better performance.
          2. Batch Size– The size of the set of samples that will be used in one step. For example, if there are 160 data samples in the dataset, and the batch size is set to 16, each epoch will be completed in 160/16=10 steps. This hyperparameter rarely needs any altering.
          3. Learning Rate– It dictates the speed at which the model updates the weights after iterating through a step. Even small changes in this parameter can have a huge impact on the model performance. The usual range lies between 0.001 and 0.0001.
          Note: Hover the mouse pointer over the question mark next to the hyperparameters to see their description.

          It’s a good idea to train a numeric classification model for a high number of epochs. The model can be trained in both JavaScript and Python. In order to choose between the two, click on the switch on top of the Training panel.

          Alert: Dependencies must be downloaded to train the model in Python; JavaScript is chosen by default.

          The accuracy of the model should increase over time. The x-axis of the graph shows the epochs, and the y-axis represents the accuracy at the corresponding epoch.


          Testing the Model

          To test the model, simply enter the input values in the “Testing” panel and click on the “Predict” button.

          The model will return the probability of the input belonging to the classes.

      Export in Python Coding

      Click on the “PictoBlox” button, and PictoBlox will load your model into the Python Coding Environment if you have opened the ML Environment in Python Coding.

    Code

    1. Create a sprite object named “Fish”. A sprite is typically a graphical element that can be animated or displayed on a screen.
    2. Create another sprite object named “Orange” and upload the “Underwater2” backdrop.
    3. Click on the Fish.py file from the Project files section.
      sprite = Sprite('Fish')
    4. Similarly, declare new sprite on the Fish.py file.
      sprite1 = Sprite('Orange')
    5. Then we will import the time, random, os, math, NumPy (as np), TensorFlow (as tf), and Pandas (as pd) modules using the import keyword.
      1. Time – For adding delays in the program.
      2. Random – For generating random positions.
      3. Pandas as pd – For using a DataFrame.
      4. Math – For using math functions in the code.
      5. Os – For reading files from the project files.
      6. NumPy as np and TensorFlow as tf – For running the trained model (used by runprediction() below).
        import random
        import time
        import numpy as np
        import tensorflow as tf
        import pandas as pd
        import os
        import math
    6. Now, make six variables, curr_x, curr_y, ang_f, mov_f, angle, and score, with initial values -170, 138, 30, 15, 90, and 0 respectively.
      1. curr_x – To store the initial x – position of fish.
      2. curr_y – To store the initial y – position of fish.
      3. ang_f – To store increment value in angle of fish on pressing specific key.
      4. mov_f – To store increment value in movement of fish on pressing specific key.
      5. angle – To store initial angle of fish.
      6. score – To store the score while playing the game.
        curr_x = -170
        curr_y = 138
        ang_f= 30
        mov_f= 15
        score = 0
        angle = 90
    7. Now set the initial position and angle of the fish.
      sprite.setx(curr_x)
      sprite.sety(curr_y)
      sprite.setdirection(DIRECTION=90)
    8. Now, make a function settarget() that generates food at a random position. We pass one argument, “t”, to the function to generate the target food on a grid with spacing t.
      1. x and y – To generate the food at a random position on the stage.
      2. time.sleep – For adding a time delay.
      3. sprite1.setx()/sety() – To set the food’s position on the stage.
        def settarget(t):
        x = random.randrange(-200, 200, t)
        y = random.randrange(-155, 155, t)
        time.sleep(0.1)
        sprite1.setx(x)
        sprite1.sety(y)
        return x, y 
    9. Now set the position of the food. The fish chases the food, so target_x and target_y are set equal to the x and y positions of the food.
      target_x, target_y = settarget(40) 
    10. Now, make a function runprediction() that predicts the class (UP, LEFT, RIGHT) from the trained model. We pass three arguments, “diff_x”, “diff_y”, and “ang”, to the function.
      1. inputValue – To store the input parameters of the function in an array.
      2. model.predict() – For predicting the output from the trained model.
      3. np.argmax() – To find the most probable prediction output.
        def runprediction(diff_x, diff_y, ang):
          inputValue=[diff_x, diff_y, ang]
          #Input Tensor
          inputTensor = tf.expand_dims(inputValue, 0)
          #Predict
          predict = model.predict(inputTensor)
          predict_index = np.argmax(predict[0], axis=0)
          #Output
          predicted_class = class_list[predict_index]
          return predicted_class
    11. After that, we will use the while True loop to run the code indefinitely. Don’t forget to add a colon ‘:’ just after the loop to avoid errors.
      while True:
    12. In the while loop, find the angle of the sprite and call the runprediction() function, passing the arguments to it.
        angle = sprite.direction()
        move = runprediction(curr_x - target_x, curr_y - target_y, angle)
    13. Now write the script for moving the fish in the forward direction and changing its direction clockwise or anticlockwise by a fixed value with the help of conditional statements.
      1. If the predicted value is “UP”, the fish will move mov_f steps in the same direction.
      2. If the predicted value is “LEFT”, the fish will change direction by a constant value in the anticlockwise direction.
      3. If the predicted value is “RIGHT”, the fish will change direction by a constant value in the clockwise direction.
        if move == "UP":
            sprite.move(mov_f)
            curr_x = sprite.x()
            curr_y = sprite.y()
        if move == "LEFT":
            angle = angle - ang_f
            sprite.setdirection(DIRECTION=angle)
        if move == "RIGHT":
            angle = angle + ang_f
            sprite.setdirection(DIRECTION=angle)
    14. Again, write a conditional statement for the score variable: if the difference between the fish and food positions is less than 20, the score is increased by one and a new target position is set.
      if abs(curr_x-target_x)<20 and abs(curr_y-target_y)<20:
          score = score + 1
          sprite.say(("your score is: {}".format(score)))
          target_x, target_y = settarget(40)
    15. Now add a delay function for delaying the movement by 0.02 seconds.
      time.sleep(0.02)
    16. The final code is as follows:
      sprite = Sprite('Fish')
      sprite1 = Sprite('Orange')
       
      import random
      import time
      import numpy as np
      import tensorflow as tf
      import pandas as pd
      import os
      import math
      
      #Load Number Model
      model= tf.keras.models.load_model(
      		"num_model.h5", 
      		custom_objects=None, 
      		compile=True, 
      		options=None)
      		
      #List of classes
      class_list = ['UP','LEFT','RIGHT',]  
      		
      curr_x = -170
      curr_y = 138
      score=0
      ang_f=30
      mov_f=15
      angle=90
      
      sprite.say(("your score is: {}".format(score)))
      sprite.setx(-170)
      sprite.sety(138)
      sprite.setdirection(DIRECTION=90)
      
      def settarget():
        x = random.randrange(-200, 200, 1)
        y = random.randrange(-155, 155, 1)
        time.sleep(0.1)
        sprite1.setx(x)
        sprite1.sety(y)
        return x, y
      
      target_x, target_y = settarget()
      
      def runprediction(diff_x, diff_y, ang):
        inputValue=[diff_x, diff_y, ang]
        #Input Tensor
        inputTensor = tf.expand_dims(inputValue, 0)
        #Predict
        predict = model.predict(inputTensor)
        predict_index = np.argmax(predict[0], axis=0)
        #Output
        predicted_class = class_list[predict_index]
        return predicted_class
      
      while True:
        angle=sprite.direction()
        move = runprediction(curr_x- target_x, curr_y-target_y, angle)
      
        if move == "UP":
          sprite.move(mov_f)
          curr_x=sprite.x()
          curr_y=sprite.y()
      
        if move == "LEFT":
          angle = angle - ang_f
          sprite.setdirection(DIRECTION=angle)
      
        if move == "RIGHT":
          angle = angle + ang_f
          sprite.setdirection(DIRECTION=angle)
       
      
        if abs(curr_x-target_x)<20 and abs(curr_y-target_y)<20:
          score = score + 1
          sprite.say(("your score is: {}".format(score)))
          target_x, target_y = settarget()
      
        time.sleep(0.2) 
      

      Final Result

      Conclusion

      Creating a Machine Learning Model of an automated fish feast game can be both complex and time-consuming. Through the steps demonstrated in this project, you can create your own Machine Learning Model of an automated game. Once trained, you can export the model into the Python Coding Environment, where you can tweak it further to give you the desired output. Try creating a Machine Learning Model of your own today and explore the possibilities of the Number Classifier in PictoBlox!

Read More
In this example, we will look into the basics of a Drip Irrigation System using the IoT House. With IoT House, you get 2 plant drip irrigation raw equipment.


Drip Irrigation

Drip irrigation is a type of micro-irrigation that has the potential to save water and nutrients by allowing water to drip slowly to the roots of plants. This uses water wisely, delivers it directly to the roots, and minimizes evaporation. Drip irrigation systems distribute water through a network of valves, pipes, tubing, and emitters. Their efficiency depends on how well the system is designed, installed, maintained, and operated. A simple drip irrigation system is shown below in the figure.

Drip Irrigation Kit

In this project, we will be using an IoT House drip irrigation kit. This kit consists of various components, out of which the following are required in our project:

  1. A feeder line pipe is used to transfer water from the water tank source to the drip irrigation system. It is also used to connect the T-connectors to make different connections.
  2. A drip emitter is a small piece used to water the plant drop by drop. It is connected to the T-connector with the help of a feeder line pipe of around 2.5 cm (as per our requirement).
  3. A T-connector is a T-shaped piece of plastic that connects one drip emitter to the adjacent drip emitter with the help of a feeder line pipe.
  4. Emitter stakes are stands with a curve at the top and a pointed end at the other. They help provide stability to the feeder-line-pipe and drip-emitter connections.
  5. A 12 V, 3 W water pump is connected at one end of the feeder pipeline. The male end of the water pump is connected to the female DC connector.

Circuit

The Water Pump Connected to the Relay: The water pump is controlled by the smart switch of the IoT House, which has a relay controlling its state. If the relay is ON, the smart switch turns ON, turning on the water pump. The relay has the following connections:

  1. GND Pin connected to GND of the Quarky Expansion Board.
  2. VCC Pin connected to VCC of the Quarky Expansion Board.
  3. Signal Pin connected to Servo 4 of the Quarky Expansion Board.

Script

The pump will turn ON when the L button of the Quarky is pressed.

Output

Read More
Learn how to assemble a Drip Irrigation System using the IoT House. This tutorial covers the assembly, circuit, and Python code for 1 or 2 plants. Also, find out how to calibrate the flow of water with the loosening and tightening of the emitter.

In this example, we will look into the basics of a Drip Irrigation System using the IoT House. With IoT House, you get 2 plant drip irrigation raw equipment.

Drip Irrigation

Drip irrigation is a type of micro-irrigation that has the potential to save water and nutrients by allowing water to drip slowly to the roots of plants. This uses water wisely, delivers it directly to the roots, and minimizes evaporation. Drip irrigation systems distribute water through a network of valves, pipes, tubing, and emitters. Their efficiency depends on how well the system is designed, installed, maintained, and operated. A simple drip irrigation system is shown below in the figure.

Drip Irrigation Assembly

The following tutorials cover how to make the Drip Irrigation System:

  1. For 1 Plant: https://ai.thestempedia.com/docs/iot-house-quarky-addon-kit-documentation/1-plant-drip-irrigation-assembly-iot-house/
  2. For 2 Plants: https://ai.thestempedia.com/docs/iot-house-quarky-addon-kit-documentation/2-plants-drip-irrigation-assembly-iot-house/

Circuit

The Water Pump Connected to the Relay: The water pump is controlled by the smart switch of the IoT House, which has a relay controlling its state. If the relay is ON, the smart switch turns ON, turning on the water pump. The relay has the following connections:

  1. GND Pin connected to GND of the Quarky Expansion Board.
  2. VCC Pin connected to VCC of the Quarky Expansion Board.
  3. Signal Pin connected to Servo 4 of the Quarky Expansion Board.

Python Code

The while loop will run continuously and check if the left pushbutton on the Quarky is pressed or not. If the left pushbutton is pressed, the relay connected to port 4 will be set to ON; otherwise, it will be set to OFF. After each check, the code will wait for one second before checking again.

#This code creates a Quarky object and an IoTHouse object
quarky = Quarky()
house = IoTHouse()

#Import the time module used for the delay below
import time

#This while loop will run continuously
while True:
  #If the left pushbutton on the Quarky is pressed...
  if quarky.readpushbutton("L"):
    #Set relay connected to port 4 to ON
    house.setrelay(0, "pwm4")
  #If the left pushbutton is not pressed...
  else:
    #Set relay connected to port 4 to OFF
    house.setrelay(1, "pwm4")

  #Wait one second before checking the pushbutton again
  time.sleep(1)

 

Output

Calibrating the Flow of Water

The flow of water can be adjusted with the loosening and tightening of the red part of the emitter.

Read More
Learn how to create a dataset and Machine Learning Model for an automated shark attack game from the user's input. See how to open the ML environment, upload data, label images, train the model, and export the Python script.

Introduction

In this example project, we are going to create a Machine Learning Model where the shark is run by the user and the fish automatically feeds on randomly generated food while escaping from the shark.

Data Collection

  • Now, we are going to collect the data of the “Shark Attack: Hungry for Fish” game.
  • This data will contain the actions that we have taken to accomplish the game successfully.
  • We will use the data we collect here to teach our device how to play the “Shark Attack: Hungry for Fish” game, i.e. to perform machine learning.
  • The data that you collect will get saved on your device as a CSV (comma-separated values) file. If you open this file in Microsoft Excel, it will look as shown below:
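For illustration, the first few rows could look something like this (the values are hypothetical; the column names match the DataFrame created later in this project):

  curr_X,curr_Y,shark_X,shark_Y,tar_x,tar_y,diff_x,diff_y,diff_x1,diff_y1,direction_f,direction_s,Action
  25,108,-177,116,40,-35,-15,143,-202,8,90,90,UP
  50,108,-173,114,40,-35,10,143,-223,6,90,87,RIGHT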

Follow the steps below:

  1. Open PictoBlox and create a new file.
  2. Select the coding environment as Python Coding Environment.
  3. Now write the code in Python.

Code for making dataset

  1. Create a sprite object named “Fish”. A sprite is typically a graphical element that can be animated or displayed on a screen.
  2. Create three more sprite objects named “Orange”, “Shark 2”, and “Button3”, and upload the “Underwater2” backdrop.
  3. Click on the Fish.py file from the Project files section.
    sprite = Sprite('Fish')
  4. Similarly, declare the new sprites in the Fish.py file.
    sprite1 = Sprite('Orange')
    sprite2 = Sprite('Shark 2')
    sprite3 = Sprite('Button3')
  5. Then we will import the time, random, os, math, TensorFlow (as tf), and pandas (as pd) modules using the import keyword.
    1. Time – For using delays in the program.
    2. Random – For generating random positions.
    3. Pandas as pd – For using DataFrames.
    4. Math – For using math functions in the code.
    5. Os – For reading files from the Project Files section.
      import random
      import time
      import tensorflow as tf
      import pandas as pd
      import os
      import math
  6. Now, make eleven variables curr_x, curr_y, shark_x, shark_y, score, chance, fish_d, fish_m, shark_m, angle_f, and angle_s with initial values 25, 108, -177, 116, 0, 5, 20, 25, 4, 90, and 90 respectively.
    1. curr_x – To store the initial x-position of the fish.
    2. curr_y – To store the initial y-position of the fish.
    3. shark_x – To store the initial x-position of the shark.
    4. shark_y – To store the initial y-position of the shark.
    5. score – To store the score while playing the game.
    6. chance – To store the chances left while playing the game.
    7. fish_d – To store the increment in the fish's direction on pressing a specific key.
    8. fish_m – To store the increment in the fish's movement on pressing a specific key.
    9. shark_m – To store the increment in the shark's movement in each loop iteration.
    10. angle_f – To store the initial angle of the fish.
    11. angle_s – To store the initial angle of the shark.
      curr_x = 25 
      curr_y = 108 
      shark_x=-177 
      shark_y=116 
      score=0 
      chance=5 
      fish_d=20 
      fish_m=25 
      shark_m=4 
      angle_f=90 
      angle_s=90
  7. Now set the initial positions and angles of both the fish and the shark.
    sprite.setx(curr_x)
    sprite.sety(curr_y)
    sprite2.setx(shark_x)
    sprite2.sety(shark_y)
    sprite.setdirection(DIRECTION=angle_f)
    sprite2.setdirection(DIRECTION=angle_s)
  8. Now, make a function settarget() that places the fish, the food, and the shark at random positions. We pass one argument “t” to the function; random.randrange() with step t snaps the generated positions to a grid with a gap of t.
    1. x and y – To generate the fish at a random position on the stage.
    2. x1 and y1 – To generate the food at a random position on the stage.
    3. x2 and y2 – To generate the shark at a random position on the stage.
    4. time.sleep() – For giving a time delay.
    5. sprite.setx()/sprite.sety() – To set the position of the fish at the random position on the stage.
    6. sprite1.setx()/sprite1.sety() – To set the position of the food at the random position on the stage.
    7. sprite2.setx()/sprite2.sety() – To set the position of the shark at the random position on the stage.
      def settarget(t):
        x = random.randrange(-200, 200, t)
        y = random.randrange(-155, 155, t)
        x1 = random.randrange(-200, 200, t)
        y1 = random.randrange(-155, 155, t)
        x2 = random.randrange(-200, 200, t)
        y2 = random.randrange(-155, 155, t)
        time.sleep(0.1)
        sprite1.setx(x1)
        sprite1.sety(y1)
        sprite.setx(x)
        sprite.sety(y)
        sprite2.setx(x2)
        sprite2.sety(y2)
        return x, y, x1, y1, x2, y2
  9. Now, make a function settarget1() that generates the food at a random position. We pass one argument “m” to the function for generating the target food on a grid with a gap of m.
    1. x and y – To generate the food at a random position on the stage.
    2. time.sleep() – For giving a time delay.
    3. sprite1.setx()/sprite1.sety() – To set the position of the food at the random position on the stage.
      def settarget1(m):
        x = random.randrange(-200, 200, m)
        y = random.randrange(-155, 155, m)
        time.sleep(0.1)
        sprite1.setx(x)
        sprite1.sety(y)
        return x, y
  10. Now set the target (food). The fish chases the food, so target_x and target_y are set equal to the x and y positions of the food.
    target_x, target_y = settarget1(40)
  11. Now create a DataFrame saved as “Chase_Data.csv” to collect the data for machine learning; if a CSV with this name already exists, load it and append the new data to it.
    if(os.path.isfile('Chase_Data.csv')):
      data=pd.read_csv('Chase_Data.csv')
    else:
      data = pd.DataFrame({"curr_X": curr_x, "curr_Y": curr_y,"shark_X": shark_x, "shark_Y": shark_y, "tar_x": target_x, "tar_y": target_y, "diff_x":curr_x-target_x, "diff_y":curr_y-target_y, "diff_x1":shark_x-curr_x, "diff_y1":shark_y-curr_y, "direction_f": angle_f, "direction_s": angle_s, "Action": "RIGHT"}, index=[0])
  12. After that, we will use the while True loop to run the code indefinitely. Don’t forget to add a colon ‘:’ just after the loop to avoid errors.
    while True:
  13. In the while loop, write the code by which the shark follows the fish, moving shark_m steps in each iteration.
    sprite2.spriteRequest.requestCommand("motion_pointtowards", {"TOWARDS": "Fish"})
    sprite2.move(shark_m)
  14. Find the directions of the shark and the fish using the PictoBlox Python functions and take the floor value of each angle.
    angle_f=sprite.direction()
    angle_s=sprite2.direction()
    anglef=math.floor(angle_f)
    angles=math.floor(angle_s)
  15. Now write the script for moving the fish in the forward direction and changing its direction clockwise or anticlockwise by a fixed value with the help of conditional statements.
    1. If the up arrow key is pressed, the fish will move fish_m steps in the same direction.
    2. After pressing the up arrow key, the action taken is stored in the DataFrame with the data.append() command.
      if sprite.iskeypressed("up arrow"):
          data = data.append({"curr_X": curr_x, "curr_Y": curr_y,"shark_X": shark_x, "shark_Y": shark_y, "tar_x": target_x, "tar_y": target_y, "diff_x":curr_x-target_x, "diff_y":curr_y-target_y, "diff_x1":shark_x-curr_x, "diff_y1":shark_y-curr_y, "direction_f": anglef, "direction_s": angles, "Action": "UP"}, ignore_index=True)
          sprite.move(fish_m)
  16. Repeat the process for setting the direction clockwise or anticlockwise.
    if sprite.iskeypressed("left arrow"):
        data = data.append({"curr_X": curr_x, "curr_Y": curr_y,"shark_X": shark_x, "shark_Y": shark_y, "tar_x": target_x, "tar_y": target_y, "diff_x":curr_x-target_x, "diff_y":curr_y-target_y, "diff_x1":shark_x-curr_x, "diff_y1":shark_y-curr_y, "direction_f": anglef, "direction_s": angles, "Action": "LEFT"}, ignore_index=True)
        angle = anglef - fish_d
        sprite.setdirection(DIRECTION=angle)
    if sprite.iskeypressed("right arrow"):
        data = data.append({"curr_X": curr_x, "curr_Y": curr_y,"shark_X": shark_x, "shark_Y": shark_y, "tar_x": target_x, "tar_y": target_y, "diff_x":curr_x-target_x, "diff_y":curr_y-target_y, "diff_x1":shark_x-curr_x, "diff_y1":shark_y-curr_y, "direction_f": anglef, "direction_s": angles, "Action": "RIGHT"}, ignore_index=True)
        angle = anglef + fish_d
        sprite.setdirection(DIRECTION=angle)
  17. Write a conditional statement for storing the data in the CSV file whenever the score is a positive multiple of two.
    if(score>0 and score%2==0):
        data.to_csv('Chase_Data.csv',index=False)
  18. Write a conditional statement for the chance variable: if the difference between the fish and shark positions is less than 20, the chance is decreased by one.
     if abs(shark_x-curr_x)<20 and abs(shark_y-curr_y)<20:
        chance= chance-1
  19. Update the positions of all three sprites; if chance becomes 0, the data is written to the Chase_Data.csv file, the positions of all three sprites are changed randomly by the settarget() function, and the chance value is reset.
      if abs(shark_x-curr_x)<20 and abs(shark_y-curr_y)<20:
        chance= chance-1
        curr_x, curr_y, target_x, target_y, shark_x, shark_y = settarget(40)
        sprite3.say(("score: ",score ," and chance:  ",chance,""))
        if (chance == 0):
          data.to_csv('Chase_Data.csv',index=False)
          curr_x, curr_y, target_x, target_y, shark_x, shark_y = settarget(40)
          chance=5
  20. Again, write a conditional statement for the score variable: if the difference between the fish and food positions is less than 20, the score is increased by one.
    if abs(curr_x-target_x)<20 and abs(curr_y-target_y)<20: 
        score = score + 1 
        sprite.say(("your score is: {}".format(score)))
  21. If the score is equal to or greater than 50, the data is written to the Chase_Data.csv file and the loop ends; otherwise, the food position is changed randomly by the settarget1() function.
    if (score >= 50):
        data.to_csv('Chase_Data.csv',index=False)
        break
    target_x, target_y = settarget1(40)
  22. Now update the curr_x, curr_y, shark_x, and shark_y variables by storing the current positions of the fish and shark, and delay each iteration by 0.02 seconds.
      curr_x=math.floor(sprite.x())
      curr_y=math.floor(sprite.y())
      shark_x=math.floor(sprite2.x())
      shark_y=math.floor(sprite2.y())
      time.sleep(0.02)
  23. The final code is as follows:
    sprite = Sprite('Fish')
    sprite1 = Sprite('Orange')
    sprite2 = Sprite('Shark 2')
    sprite3 = Sprite('Button3')
    import random
    import time
    import numpy as np
    import tensorflow as tf
    import pandas as pd
    import os
    import math
    		
    curr_x = 25
    curr_y = 108
    shark_x=-177
    shark_y=116
    score=0
    chance=5
    fish_d=20
    fish_m=25
    shark_m=4
    angle_f=90
    angle_s=90
    sprite3.say(("score: ",score ," and chance:  ",chance,""))
    sprite.setx(curr_x)
    sprite.sety(curr_y)
    sprite2.setx(shark_x)
    sprite2.sety(shark_y)
    sprite.setdirection(DIRECTION=angle_f)
    sprite2.setdirection(DIRECTION=angle_s)
    def settarget(t):
      x = random.randrange(-200, 200, t)
      y = random.randrange(-155, 155, t)
      x1 = random.randrange(-200, 200, t)
      y1 = random.randrange(-155, 155, t)
      x2 = random.randrange(-200, 200, t)
      y2 = random.randrange(-155, 155, t)
      time.sleep(0.1)
      sprite1.setx(x1)
      sprite1.sety(y1)
      sprite.setx(x)
      sprite.sety(y)
      sprite2.setx(x2)
      sprite2.sety(y2)
      return x, y, x1, y1, x2, y2
    def settarget1(m):
      x = random.randrange(-200, 200, m)
      y = random.randrange(-155, 155, m)
      time.sleep(0.1)
      sprite1.setx(x)
      sprite1.sety(y)
      return x, y
    target_x, target_y = settarget1(40)
    if(os.path.isfile('Chase_Data.csv')):
      data=pd.read_csv('Chase_Data.csv')
    else:
      data = pd.DataFrame({"curr_X": curr_x, "curr_Y": curr_y,"shark_X": shark_x, "shark_Y": shark_y, "tar_x": target_x, "tar_y": target_y, "diff_x":curr_x-target_x, "diff_y":curr_y-target_y, "diff_x1":shark_x-curr_x, "diff_y1":shark_y-curr_y, "direction_f": angle_f, "direction_s": angle_s, "Action": "RIGHT"}, index=[0])
    while True:
      # sprite2.pointto()
      sprite2.spriteRequest.requestCommand("motion_pointtowards", {"TOWARDS": "Fish"})
      sprite2.move(shark_m)
      angle_f=sprite.direction()
      angle_s=sprite2.direction()
      anglef=math.floor(angle_f)
      angles=math.floor(angle_s)
      if sprite.iskeypressed("up arrow"):
        data = data.append({"curr_X": curr_x, "curr_Y": curr_y,"shark_X": shark_x, "shark_Y": shark_y, "tar_x": target_x, "tar_y": target_y, "diff_x":curr_x-target_x, "diff_y":curr_y-target_y, "diff_x1":shark_x-curr_x, "diff_y1":shark_y-curr_y, "direction_f": anglef, "direction_s": angles, "Action": "UP"}, ignore_index=True)
        sprite.move(fish_m)
      if sprite.iskeypressed("left arrow"):
        data = data.append({"curr_X": curr_x, "curr_Y": curr_y,"shark_X": shark_x, "shark_Y": shark_y, "tar_x": target_x, "tar_y": target_y, "diff_x":curr_x-target_x, "diff_y":curr_y-target_y, "diff_x1":shark_x-curr_x, "diff_y1":shark_y-curr_y, "direction_f": anglef, "direction_s": angles, "Action": "LEFT"}, ignore_index=True)
        angle = anglef - fish_d
        sprite.setdirection(DIRECTION=angle)
        
         
      if sprite.iskeypressed("right arrow"):
        data = data.append({"curr_X": curr_x, "curr_Y": curr_y,"shark_X": shark_x, "shark_Y": shark_y, "tar_x": target_x, "tar_y": target_y, "diff_x":curr_x-target_x, "diff_y":curr_y-target_y, "diff_x1":shark_x-curr_x, "diff_y1":shark_y-curr_y, "direction_f": anglef, "direction_s": angles, "Action": "RIGHT"}, ignore_index=True)
        angle = anglef + fish_d
        sprite.setdirection(DIRECTION=angle)
        
      if(score>0 and score%2==0):
        data.to_csv('Chase_Data.csv',index=False)
      
      if abs(shark_x-curr_x)<20 and abs(shark_y-curr_y)<20:
        chance= chance-1
        curr_x, curr_y, target_x, target_y, shark_x, shark_y = settarget(40)
        sprite3.say(("score: ",score ," and chance:  ",chance,""))
        if (chance == 0):
          data.to_csv('Chase_Data.csv',index=False)
          curr_x, curr_y, target_x, target_y, shark_x, shark_y = settarget(40)
          chance=5
      if abs(curr_x-target_x)<20 and abs(curr_y-target_y)<20:
        score = score + 1
        sprite3.say(("score: ",score ," and chance:  ",chance,""))
        if (score >= 50):
          data.to_csv('Chase_Data.csv',index=False)
          break
        target_x, target_y = settarget1(40)
      curr_x=math.floor(sprite.x())
      curr_y=math.floor(sprite.y())
      shark_x=math.floor(sprite2.x())
      shark_y=math.floor(sprite2.y())
      time.sleep(0.02)
    
  24. Press the Run button and play the fish feast game to collect data.
  25. Store this dataset on your local computer.

    Numbers(C/R) in Machine Learning Environment

    Datasets on the internet are hardly ever fit to directly train on. Programmers often have to take care of unnecessary columns, text data, target columns, correlations, etc. Thankfully, PictoBlox’s ML Environment is packed with features to help us pre-process the data as per our liking.

    Let’s create the ML model.

    Opening Numbers(C/R) Workflow

    Alert: The Machine Learning Environment for model creation is available only in the desktop version of PictoBlox for Windows, macOS, or Linux. It is not available in the Web, Android, and iOS versions.

    Follow the steps below:

    1. Open PictoBlox and create a new file.
    2. Select the coding environment as Python Coding Environment.
    3. Select the “Open ML Environment” option under the “Files” tab to access the ML Environment.
    4. You’ll be greeted with the following screen.
      Click on “Create New Project“. A window will open. Type in a project name of your choice and select the “Numbers(C/R)” extension. Click the “Create Project” button to open the Numbers(C/R) window.
    5. You shall see the Numbers C/R workflow with an option to either “Upload Dataset” or “Create Dataset”.

      Uploading/Creating Dataset

      Datasets can either be uploaded or created on the ML Environment. Let’s see how it is done.

      Uploading a dataset
      1. To upload a dataset, click on the Upload Dataset button and then on the Choose CSV from your files button.
        Note: An uploaded dataset must be a “.csv” file.
      2. Once uploaded, the first 50 rows of the uploaded CSV document will show up in the window.

      3. If you look at the output column, all the values are currently “0”. Hence, we first need to create an output column.
        1. In the Dataset table, click on the tick near Select All to de-select all the columns.
        2. Click on the tick of the Action column to select it. We will make this column the output.
        3. The output column must always be numerical. Hence, click on the Text to Number button to convert the data within this column to the numerical type.
        4. Now select it again and press the Set as Output button to set this column as the output.
        5. There are also columns that are not useful for training our model and need to be disabled. Select them and click the Disable button in the Selected columns section. The pandas sketch below shows the equivalent steps in code.
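      For intuition, the same pre-processing can be expressed in plain pandas. This is only a sketch: the ML Environment does all of this through its GUI, the column names follow the Chase_Data.csv created earlier, and the feature selection below is one reasonable choice rather than the tool's exact behaviour.

        import pandas as pd

        # Load the dataset collected earlier in this project
        data = pd.read_csv('Chase_Data.csv')

        # "Text to Number": map the text labels in the Action column to integers
        data['Action'], class_names = pd.factorize(data['Action'])
        print(list(class_names))  # class order as first encountered, e.g. ['RIGHT', 'UP', 'LEFT']

        # "Disable" columns that are not useful; keep the difference and direction features
        features = data[['diff_x', 'diff_y', 'diff_x1', 'diff_y1', 'direction_f', 'direction_s']]

        # "Set as Output": the Action column becomes the target
        target = data['Action']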

          Creating a Dataset
          1. To create a dataset, click on the Create Dataset button.
          2. Select the number of rows and columns that are to be added and click on the Create button. More rows and columns can be added as and when needed.

          Notes:

          1. Each column represents a feature. These are the values used by the model to train itself.
          2. The “Output” column contains the target values. These are the values that we expect the model to return when features are passed.
          3. The window only shows the first 50 rows of the dataset.
          4. Un-check the “Select All” checkbox to un-select all the columns.

          Training the Model

          After data is pre-processed and optimized, it’s fit to be used in model training. To train the model, simply click the “Train Model” button found in the “Training” panel.

          By training the model, meaningful information is extracted from the numbers, and that in turn updates the weights. Once these weights are saved, the model can be used to make predictions on data previously unseen.

          The model’s function is to use the input data and predict the output. The target column must always contain numbers.

          However, before training the model, there are a few hyperparameters that need to be understood. Click on the “Advanced” tab to view them.

          There are three hyperparameters that can be altered in the Numbers(C/R) Extension:

          1. Epochs– The total number of times the data will be fed through the training model. Therefore, in 10 epochs, the dataset will be fed through the training model 10 times. Increasing the number of epochs can often lead to better performance.
          2. Batch Size– The size of the set of samples that will be used in one step. For example, if there are 160 data samples in the dataset, and the batch size is set to 16, each epoch will be completed in 160/16=10 steps. This hyperparameter rarely needs any altering.
          3. Learning Rate– It dictates the speed at which the model updates the weights after iterating through a step. Even small changes in this parameter can have a huge impact on the model performance. The usual range lies between 0.001 and 0.0001.
          Note: Hover the mouse pointer over the question mark next to the hyperparameters to see their description.

          It’s a good idea to train a numeric classification model for a high number of epochs. The model can be trained in both JavaScript and Python. In order to choose between the two, click on the switch on top of the Training panel.

          Alert: Dependencies must be downloaded to train the model in Python; JavaScript is chosen by default.

          The accuracy of the model should increase over time. The x-axis of the graph shows the epochs, and the y-axis represents the accuracy at the corresponding epoch.


          Testing the Model

          To test the model, simply enter the input values in the “Testing” panel and click on the “Predict” button.

          The model will return the probability of the input belonging to the classes.

      Export in Python Coding

      Click on the “PictoBlox” button, and PictoBlox will load your model into the Python Coding Environment if you have opened the ML Environment in Python Coding.

    Code

    1. Create a sprite object named “Fish”. A sprite is typically a graphical element that can be animated or displayed on a screen.
    2. Create three more sprite objects named “Orange”, “Shark 2”, and “Button3”, and upload the “Underwater2” backdrop.
    3. Click on the Fish.py file from the Project files section.
      sprite = Sprite('Fish')
    4. Similarly, declare the new sprites in the Fish.py file.
      sprite1 = Sprite('Orange')
      sprite2 = Sprite('Shark 2')
      sprite3 = Sprite('Button3')
    5. Then we will import the time, random, os, math, NumPy (as np), TensorFlow (as tf), and pandas (as pd) modules using the import keyword.
      1. Time – For using delays in the program.
      2. Random – For generating random positions.
      3. NumPy as np – For finding the most probable prediction with np.argmax().
      4. Pandas as pd – For using DataFrames.
      5. Math – For using math functions in the code.
      6. Os – For reading files from the Project Files section.
        import random
        import time
        import numpy as np
        import tensorflow as tf
        import pandas as pd
        import os
        import math
    6. Now, make twelve variables curr_x, curr_y, shark_x, shark_y, score, chance, fish_d, fish_m, shark_m, shark_d, angle_f, and angle_s with initial values 25, 108, -177, 116, 0, 5, 20, 35, 25, 20, 90, and 90 respectively.
      1. curr_x – To store the initial x-position of the fish.
      2. curr_y – To store the initial y-position of the fish.
      3. shark_x – To store the initial x-position of the shark.
      4. shark_y – To store the initial y-position of the shark.
      5. score – To store the score while playing the game.
      6. chance – To store the chances left while playing the game.
      7. fish_d – To store the increment in the fish's direction.
      8. fish_m – To store the increment in the fish's movement.
      9. shark_m – To store the increment in the shark's movement on pressing a specific key.
      10. shark_d – To store the increment in the shark's direction on pressing a specific key.
      11. angle_f – To store the initial angle of the fish.
      12. angle_s – To store the initial angle of the shark.
        curr_x = 25 
        curr_y = 108 
        shark_x=-177 
        shark_y=116 
        score=0 
        chance=5 
        fish_d=20 
        fish_m=35 
        shark_m=25 
        shark_d= 20 
        angle_f=90 
        angle_s=90
    7. Now set the initial positions and angles of both the fish and the shark.
      sprite.setx(curr_x)
      sprite.sety(curr_y)
      sprite2.setx(shark_x)
      sprite2.sety(shark_y)
      sprite.setdirection(DIRECTION=angle_f)
      sprite2.setdirection(DIRECTION=angle_s)
    8. Now, make a function settarget1() that places the fish, the food, and the shark at random positions. We pass one argument “t” to the function; random.randrange() with step t snaps the generated positions to a grid with a gap of t.
      1. x and y – To generate the fish at a random position on the stage.
      2. x1 and y1 – To generate the food at a random position on the stage.
      3. x2 and y2 – To generate the shark at a random position on the stage.
      4. time.sleep() – For giving a time delay.
      5. sprite.setx()/sprite.sety() – To set the position of the fish at the random position on the stage.
      6. sprite1.setx()/sprite1.sety() – To set the position of the food at the random position on the stage.
      7. sprite2.setx()/sprite2.sety() – To set the position of the shark at the random position on the stage.
        def settarget1(t):
          x = random.randrange(-200, 200, t)
          y = random.randrange(-155, 155, t)
          x1 = random.randrange(-200, 200, t)
          y1 = random.randrange(-155, 155, t)
          x2 = random.randrange(-200, 200, t)
          y2 = random.randrange(-155, 155, t)
          time.sleep(0.1)
          sprite1.setx(x1)
          sprite1.sety(y1)
          sprite.setx(x)
          sprite.sety(y)
          sprite2.setx(x2)
          sprite2.sety(y2)
          return x, y, x1, y1, x2, y2
    9. Now, make a function settarget() that generates the food at a random position. We pass one argument “m” to the function for generating the target food on a grid with a gap of m.
      1. x and y – To generate the food at a random position on the stage.
      2. time.sleep() – For giving a time delay.
      3. sprite1.setx()/sprite1.sety() – To set the position of the food at the random position on the stage.
        def settarget(m):
          x = random.randrange(-200, 200, m)
          y = random.randrange(-155, 155, m)
          time.sleep(0.1)
          sprite1.setx(x)
          sprite1.sety(y)
          return x, y
    10. Now set the target (food). The fish chases the food, so target_x and target_y are set equal to the x and y positions of the food.
      target_x, target_y = settarget(40)
    11. Now, make a function runprediction() that predicts the class (UP, LEFT, RIGHT) from the trained model. We pass six arguments, “diff_x”, “diff_y”, “diff_x1”, “diff_y1”, “ang1”, and “ang2”, to the function.
      1. inputValue – To store the input parameters of the function in an array.
      2. model.predict() – For predicting the output from the trained model.
      3. np.argmax() – To find the most probable prediction output.
        def runprediction(diff_x, diff_y, diff_x1, diff_y1, ang1, ang2):
          inputValue=[diff_x, diff_y, diff_x1, diff_y1, ang1, ang2]
          #Input Tensor
          inputTensor = tf.expand_dims(inputValue, 0)
          #Predict
          predict = model.predict(inputTensor)
          predict_index = np.argmax(predict[0], axis=0)
          #Output
          predicted_class = class_list[predict_index]
          return predicted_class
    12. After that, we will use the while True loop to run the code indefinitely. Don’t forget to add a colon ‘:’ just after the loop to avoid errors.
      while True:
    13. Now write the script for moving the shark in the forward direction and changing its direction clockwise or anticlockwise by a fixed value with the help of conditional statements.
      1. If the up arrow key is pressed, the shark will move shark_m steps in the same direction.
      2. After pressing the up arrow key, the shark_x and shark_y variables are updated by storing the current position of the shark.
        if sprite.iskeypressed("up arrow"):
            sprite2.move(shark_m)
            shark_x=sprite2.x()
            shark_y=sprite2.y() 
    14. Repeat the process for setting the direction clockwise or anticlockwise.
        if sprite.iskeypressed("left arrow"):
          angles = angle_s - shark_d
          sprite2.setdirection(DIRECTION=angles)
          
        if sprite.iskeypressed("right arrow"):
          angles = angle_s + shark_d
          sprite2.setdirection(DIRECTION=angles)
    15. Find the directions of the shark and the fish using the PictoBlox Python functions and store the prediction value in the ‘move’ variable.
      angle_f=sprite.direction()
      angle_s=sprite2.direction()
      move = runprediction(curr_x- target_x, curr_y-target_y, shark_x-curr_x, shark_y-curr_y, angle_f, angle_s)
    16. Now write the script for moving the fish in the forward direction and changing its direction clockwise or anticlockwise by a fixed value with the help of conditional statements.
      1. If the predicted value is “UP”, the fish will move fish_m steps in the same direction.
      2. If the predicted value is “LEFT”, the fish will change direction by a constant value in the anticlockwise direction.
      3. If the predicted value is “RIGHT”, the fish will change direction by a constant value in the clockwise direction.
        if move == "UP":
            sprite.move(fish_m)
            curr_x=sprite.x()
            curr_y=sprite.y()
        
        if move == "LEFT":
            angle = angle_f - fish_d
            sprite.setdirection(DIRECTION=angle)
        
        if move == "RIGHT":
            angle = angle_f + fish_d
            sprite.setdirection(DIRECTION=angle)
    17. Write a conditional statement for the chance variable: if the difference between the fish and shark positions is less than 20, the chance is decreased by one.
       if abs(shark_x-curr_x)<20 and abs(shark_y-curr_y)<20:
          chance= chance-1
    18. Update the positions of all three sprites; if chance becomes 0, the positions of all three sprites are changed randomly by the settarget1() function and the chance value is reset.
        if abs(shark_x-curr_x)<20 and abs(shark_y-curr_y)<20:
          chance= chance-1
          curr_x, curr_y, target_x, target_y, shark_x, shark_y = settarget1(40)
          sprite3.say(("score: ",score ," and chance:  ",chance,""))
          if (chance == 0):
            chance=5
    19. Again, write a conditional statement for the score variable: if the difference between the fish and food positions is less than 20, the score is increased by one and the food position is changed randomly by the settarget() function.
      if abs(curr_x-target_x)<20 and abs(curr_y-target_y)<20: 
          score = score + 1 
          sprite.say(("your score is: {}".format(score)))
          target_x, target_y = settarget(4)
    20. The final code is as follows:
      sprite = Sprite('Fish')
      sprite1 = Sprite('Orange')
      sprite2 = Sprite('Shark 2')
      sprite3 = Sprite('Button3')
       
      import random
      import time
      import numpy as np
      import tensorflow as tf
      import pandas as pd
      import os
      import math
      #Load Number Model
      model= tf.keras.models.load_model(
      		"num_model.h5", 
      		custom_objects=None, 
      		compile=True, 
      		options=None)
      		
      #List of classes
      class_list = ['UP','RIGHT','LEFT',]  
      		
      curr_x = 25
      curr_y = 108
      shark_x=-177
      shark_y=116
      score=0
      chance=5
      fish_d=20
      fish_m=35
      shark_m=25
      shark_d=20
      angle_f=90
      angle_s=90
      
      sprite3.say(("score: ",score ," and chance:  ",chance,""))
       
      sprite.setx(curr_x) 
      sprite.sety(curr_y) 
      sprite2.setx(shark_x) 
      sprite2.sety(shark_y) 
      sprite.setdirection(DIRECTION=angle_f) 
      sprite2.setdirection(DIRECTION=angle_s)
      
      def settarget1(t):
        x = random.randrange(-200, 200, t)
        y = random.randrange(-155, 155, t)
        x1 = random.randrange(-200, 200, t)
        y1 = random.randrange(-155, 155, t)
        x2 = random.randrange(-200, 200, t)
        y2 = random.randrange(-155, 155, t)
        time.sleep(0.1)
        sprite1.setx(x1)
        sprite1.sety(y1)
        sprite.setx(x)
        sprite.sety(y)
        sprite2.setx(x2)
        sprite2.sety(y2)
        return x, y, x1, y1, x2, y2
        
      def settarget(m):
        x = random.randrange(-200, 200, m)
        y = random.randrange(-155, 155, m)
        time.sleep(0.1)
        sprite1.setx(x)
        sprite1.sety(y)
        return x, y
        
      target_x, target_y = settarget(40)
      def runprediction(diff_x, diff_y, diff_x1, diff_y1, ang1, ang2):
        inputValue=[diff_x, diff_y, diff_x1, diff_y1, ang1, ang2]
        #Input Tensor
        inputTensor = tf.expand_dims(inputValue, 0)
        #Predict
        predict = model.predict(inputTensor)
        predict_index = np.argmax(predict[0], axis=0)
        #Output
        predicted_class = class_list[predict_index]
        return predicted_class
      while True:
        if sprite.iskeypressed("up arrow"):
          sprite2.move(shark_m)
          shark_x=sprite2.x()
          shark_y=sprite2.y()
          
        if sprite.iskeypressed("left arrow"):
          angles = angle_s - shark_d
          sprite2.setdirection(DIRECTION=angles)
        if sprite.iskeypressed("right arrow"):
          angles = angle_s + shark_d
          sprite2.setdirection(DIRECTION=angles)
       
        angle_f=sprite.direction()
        angle_s=sprite2.direction()
        move = runprediction(curr_x- target_x, curr_y-target_y, shark_x-curr_x, shark_y-curr_y, angle_f, angle_s)
        
        if move == "UP":
          sprite.move(fish_m)
          curr_x=sprite.x()
          curr_y=sprite.y()
      
        if move == "LEFT":
          angle = angle_f - fish_d
          sprite.setdirection(DIRECTION=angle)
      
        if move == "RIGHT":
          angle = angle_f + fish_d
          sprite.setdirection(DIRECTION=angle)
       
        if abs(shark_x-curr_x)<20 and abs(shark_y-curr_y)<20:
          chance= chance-1
          curr_x, curr_y, target_x, target_y, shark_x, shark_y = settarget1(40)
          sprite3.say(("score: ",score ," and chance:  ",chance,""))
          if (chance == 0):
            chance=5
        if abs(curr_x-target_x)<35 and abs(curr_y-target_y)<35:
          score = score + 1
          sprite3.say(("score: ",score ," and chance:  ",chance,""))
          target_x, target_y = settarget(4)
      
        time.sleep(0.2) 
      

      Final Result

      Conclusion

      Creating a Machine Learning Model of the “Shark Attack: Hungry for Fish” game can be both complex and time-consuming. Through the steps demonstrated in this project, you can create your own Machine Learning Model of an automated game. Once trained, you can export the model into the Python Coding Environment, where you can tweak it further to give you the desired output. Try creating a Machine Learning Model of your own today and explore the possibilities of the Number Classifier in PictoBlox!

Read More
This example demonstrates how to use the Soil Moisture sensor to detect the moisture in the soil and water the plant using the drip system. The system will water the plant when the moisture of the soil is low.


Drip Irrigation Assembly Guide

The following tutorials cover how to make the Drip Irrigation System:

  1. For 1 Plant: https://ai.thestempedia.com/docs/iot-house-quarky-addon-kit-documentation/1-plant-drip-irrigation-assembly-iot-house/
  2. For 2 Plants: https://ai.thestempedia.com/docs/iot-house-quarky-addon-kit-documentation/2-plants-drip-irrigation-assembly-iot-house/

Circuit

We are using 2 devices in this project:

  1. Moisture Sensor: The moisture sensor provides real-time moisture reading from the soil. The moisture sensor connections are as follows:
    1. GND Pin connected to GND of the Quarky Expansion Board.
    2. VCC Pin connected to VCC of the Quarky Expansion Board.
    3. Signal Pin connected to A2 of the Quarky Expansion Board.
  2. The Water Pump Connected to the Relay: The water pump is controlled by the smart switch of the IoT House, which has a relay controlling its state. If the relay is ON, the smart switch turns ON, turning on the water pump. The relay has the following connections:
    1. GND Pin connected to GND of the Quarky Expansion Board.
    2. VCC Pin connected to VCC of the Quarky Expansion Board.
    3. Signal Pin connected to Servo 8 of the Quarky Expansion Board.

Note:  Once the connection is done, make sure you have the drip irrigation system placed with some water in the tank.

Script

The project has 3 scripts:

  1. Script to display the real-time moisture level on the display of the Quarky. Based on the moisture value, the number of LEDs will light up. The script is a custom block defined with the name – Display Soil Moisture Level.
  2. The second script is another LED display showing the animation of the watering. The script is a custom block defined with the name – Watering Animation.
  3. The final script is the main script which has the logic to detect the moisture value. If the value becomes less than 30%, the pump gets ON and the watering continues until the moisture level gets to 80%.

Output

Uploading Code

You can also make the script work independently of PictoBlox using the Upload Mode. For that, switch to Upload Mode and replace the when green flag clicked block with the when Quarky starts up block.

Read More
Automate watering your plants using PictoBlox and Python with the help of the Soil Moisture sensor and the IoT House Quarky Addon Kit. Learn how to assemble the drip irrigation system, set up the circuit, and write Python code to control it.

This example demonstrates how to use the Soil Moisture sensor to detect the moisture in the soil and water the plant using the drip system in PictoBlox Python Environment. The system will water the plant when the moisture of the soil is low.

Drip Irrigation Assembly

The following tutorials cover how to make the Drip Irrigation System:

  1. For 1 Plant: https://ai.thestempedia.com/docs/iot-house-quarky-addon-kit-documentation/1-plant-drip-irrigation-assembly-iot-house/
  2. For 2 Plants: https://ai.thestempedia.com/docs/iot-house-quarky-addon-kit-documentation/2-plants-drip-irrigation-assembly-iot-house/

Circuit

We are using 2 devices in this project:

  1. Moisture Sensor: The moisture sensor provides real-time moisture reading from the soil. The moisture sensor connections are as follows:
    1. GND Pin connected to GND of the Quarky Expansion Board.
    2. VCC Pin connected to VCC of the Quarky Expansion Board.
    3. Signal Pin connected to A2 of the Quarky Expansion Board.
  2. The Water Pump Connected to the Relay: The water pump is controlled by the smart switch of the IoT House, which has a relay controlling its state. If the relay is ON, the smart switch turns ON, turning on the water pump. The relay has the following connections:
    1. GND Pin connected to GND of the Quarky Expansion Board.
    2. VCC Pin connected to VCC of the Quarky Expansion Board.
    3. Signal Pin connected to Servo 8 of the Quarky Expansion Board.

Note:  Once the connection is done, make sure you have the drip irrigation system placed with some water in the tank.

Python Code for Stage Mode

The code does the following:

  1. This code creates two instances, one of the Quarky class called ‘quarky’ and one of the IoTHouse class called ‘house’. It also imports the time library which contains functions for working with time.
  2. The code then creates two functions.
    1. The first, Display_Soil_Moisture_Level, uses the Quarky and IoTHouse instances to read the soil moisture level from the IoTHouse and then display the moisture level on the Quarky.
    2. The second, Watering_Animation, uses the Quarky instance to display a watering animation on the Quarky.
  3. Finally, the code sets the moisture pin on the IoTHouse to A2 and enters an infinite loop. The loop calls the Display_Soil_Moisture_Level function, and if the moisture level is below 30, it turns on the relay on the IoTHouse and calls the Watering_Animation function until the moisture level is above 80. Once the moisture level is above 80, the loop turns off the relay 0 on the IoTHouse and pauses for 2 seconds before repeating.
# Create an instance of the Quarky class called quarky
quarky = Quarky()

# Create an instance of the IoTHouse class called house
house = IoTHouse()

# Import the time library which contains functions for working with time
import time


# Create a function called Display_Soil_Moisture_Level which will display the soil moisture level on the Quarky
def Display_Soil_Moisture_Level():
  # Clear the display on Quarky
  quarky.cleardisplay()
  # Read the moisture level from the IoTHouse
  Moisture = house.readmoisture()
  # Iterate through each row and column of the display
  for row in range(1, 6):
    for column in range(1, 8):
      # If the current row and column number is less than the moisture level
      # Turn on the corresponding LED on Quarky
      if ((((row - 1) * 20) + (column * (10 / 7))) < Moisture):
        quarky.setled(column, row, [0, 255, 0], 33)


# Create a function called Watering_Animation to display a watering animation on the Quarky
def Watering_Animation():
  # Draw the first pattern on Quarky
  quarky.drawpattern("jfjjjfjjdjjjjjfdfjjjjjfjjjjjjjjjjjj")
  time.sleep(0.2)
  # Draw the second pattern on Quarky
  quarky.drawpattern("jjjjfdfjfjjjfjjdjjjjjfdfjjjjjfjjjjj")
  time.sleep(0.2)
  # Draw the third pattern on Quarky
  quarky.drawpattern("jjjjjdjjjjjfdfjfjjjfjjdjjjjjfdfjjjj")
  time.sleep(0.2)
  # Draw the fourth pattern on Quarky
  quarky.drawpattern("jjjjjfjjjjjjdjjjjjfdfjfjjjfjjdjjjjj")
  time.sleep(0.2)
  # Draw the fifth pattern on Quarky
  quarky.drawpattern("jjjjjjjjjjjjfjjjjjjdjjjjjfdfjfjjjfj")
  time.sleep(0.2)
  # Draw the sixth pattern on Quarky
  quarky.drawpattern("jfjjjjjjjjjjjjjjjjjfjjjjjjdjjjjjfdf")
  time.sleep(0.2)
  # Draw the seventh pattern on Quarky
  quarky.drawpattern("fdfjjjjjfjjjjjjjjjjjjjjjjjfjjjjjjdj")
  time.sleep(0.2)
  # Draw the eighth pattern on Quarky
  quarky.drawpattern("jdjjjjjfdfjjjjjfjjjjjjjjjjjjjjjjjfj")
  time.sleep(0.2)


# Set the moisture pin on the IoT House to A2
house.setmoisturepin("A2")

# Create an infinite loop
while True:
  # Call the Display_Soil_Moisture_Level function
  Display_Soil_Moisture_Level()

  # If the moisture level is below 30
  if (house.readmoisture() < 30):
    # Turn on the relay 0 on the IoTHouse
    house.setrelay(0, "pwm4")
    # Create a loop which will run until the moisture level is above 80
    while not ((house.readmoisture() > 80)):
      # Call the Watering_Animation function
      Watering_Animation()
    # Turn off the relay 0 on the IoTHouse
    house.setrelay(1, "pwm4")
  # Pause for 2 seconds
  time.sleep(2)

Output

Python Code for Upload Mode

You can also make the script work independently of PictoBlox using the Upload Mode. For that, switch to Upload Mode and replace the when green flag clicked block with the when Quarky starts up block.

 

Only the object initialization changes in the Upload Mode.

# Create an instance of the Quarky class called quarky
from quarky import *

# Create an instance of the IoTHouse class called house
import iothouse
house = iothouse.iothouse()

# Import the time library which contains functions for working with time
import time

# Create a function called Display_Soil_Moisture_Level which will display the soil moisture level on the Quarky
def Display_Soil_Moisture_Level():
  # Clear the display on Quarky
  quarky.cleardisplay()
  # Read the moisture level from the IoTHouse
  Moisture = house.readmoisture()
  # Iterate through each row and column of the display
  for row in range(1, 6):
    for column in range(1, 8):
      # If the current row and column number is less than the moisture level
      # Turn on the corresponding LED on Quarky
      if ((((row - 1) * 20) + (column * (10 / 7))) < Moisture):
        quarky.setled(column, row, [0, 255, 0], 33)


# Create a function called Watering_Animation to display a watering animation on the Quarky
def Watering_Animation():
  # Draw the first pattern on Quarky
  quarky.drawpattern("jfjjjfjjdjjjjjfdfjjjjjfjjjjjjjjjjjj")
  time.sleep(0.2)
  # Draw the second pattern on Quarky
  quarky.drawpattern("jjjjfdfjfjjjfjjdjjjjjfdfjjjjjfjjjjj")
  time.sleep(0.2)
  # Draw the third pattern on Quarky
  quarky.drawpattern("jjjjjdjjjjjfdfjfjjjfjjdjjjjjfdfjjjj")
  time.sleep(0.2)
  # Draw the fourth pattern on Quarky
  quarky.drawpattern("jjjjjfjjjjjjdjjjjjfdfjfjjjfjjdjjjjj")
  time.sleep(0.2)
  # Draw the fifth pattern on Quarky
  quarky.drawpattern("jjjjjjjjjjjjfjjjjjjdjjjjjfdfjfjjjfj")
  time.sleep(0.2)
  # Draw the sixth pattern on Quarky
  quarky.drawpattern("jfjjjjjjjjjjjjjjjjjfjjjjjjdjjjjjfdf")
  time.sleep(0.2)
  # Draw the seventh pattern on Quarky
  quarky.drawpattern("fdfjjjjjfjjjjjjjjjjjjjjjjjfjjjjjjdj")
  time.sleep(0.2)
  # Draw the eighth pattern on Quarky
  quarky.drawpattern("jdjjjjjfdfjjjjjfjjjjjjjjjjjjjjjjjfj")
  time.sleep(0.2)


# Set the moisture pin on the IoT House to A2
house.setmoisturepin("A2")

# Create an infinite loop
while True:
  # Call the Display_Soil_Moisture_Level function
  Display_Soil_Moisture_Level()

  # If the moisture level is below 30
  if (house.readmoisture() < 30):
    # Turn on the relay 0 on the IoTHouse
    house.setrelay(0, "pwm4")
    # Create a loop which will run until the moisture level is above 80
    while not ((house.readmoisture() > 80)):
      # Call the Watering_Animation function
      Watering_Animation()

    # Turn off the relay 0 on the IoTHouse
    house.setrelay(1, "pwm4")
  # Pause for 2 seconds
  time.sleep(2)
Read More
[PictoBloxExtension]