Mars Rover - Blocks, Python Functions, Projects | PictoBlox Extension

Mars Rover

Mars rover extension graphics
Extension Description
Control the Mars Rover Robot with Quarky.

Introduction

Mars is a fascinating planet. It’s icy cold and covered in reddish dust and dirt. Like Earth, it has volcanoes, gullies, and flat plains. Scientists can also see channels that look like they were carved by rivers and streams a long, long time ago. Over the years, NASA has sent five robotic vehicles, called rovers, to Mars: Sojourner, Spirit, Opportunity, Curiosity, and Perseverance.

STEMpedia has created a miniature version of the Mars Rover to educate students about the different aspects of the Mars Rover in a more practical way.

The Mars Rover is programmable with PictoBlox in both modes – Block Coding and Python Coding.

Motor and Servo Motor

In our Mars rover, there are a total of 6 motors and 5 servo motors. 

The motors drive the wheels, moving the rover forward and backward. All three left-side motors are connected to the left motor port of Quarky and all three right-side motors to the right motor port, each using a 3-port wire. This means that to control the Mars Rover we only have to control 2 motors – Left and Right. Each has 2 parameters to control – direction (forward or backward) and speed. With this control, the Mars Rover can perform all the desired motions.
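
Since all the motors on one side share a single port, the drive logic reduces to two signed speed values. Here is a minimal sketch in plain Python (independent of the actual PictoBlox API; the function name and value ranges are illustrative, not PictoBlox names):

```python
def drive_command(direction, speed):
    """Return (left, right) signed speeds for Quarky's two motor ports.

    direction: "forward" or "backward"; speed: 0-100 percent.
    A positive value means forward rotation on that port.
    """
    if not 0 <= speed <= 100:
        raise ValueError("speed must be between 0 and 100")
    sign = 1 if direction == "forward" else -1
    # Both ports get the same signed speed, so the rover drives straight.
    return (sign * speed, sign * speed)

left, right = drive_command("forward", 50)
```

Giving the two ports different speeds, or opposite signs, is what turning manoeuvres build on.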

The servo motors rotate the complete wheel assemblies so that the rover can change its wheel alignment and, with it, its path. They play a major role in turning the Mars Rover. The front two and back two wheel assemblies are each connected to one servo motor. Some important turns:

  1. Turning left on the same point:
  2. Turning right on the same point:
  3. Turning left on a circle:
  4. Turning right on a circle:
Note:  The middle wheel assemblies on either side are not connected to servo motors.
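
Geometrically, a spot turn points all four corner wheels at the rover’s centre, while a circular turn aligns them to arcs around a common turning centre, with the inner wheels turned more sharply than the outer ones. The sketch below computes illustrative corner-servo angles from that geometry; the TRACK and WHEELBASE values are made-up placeholders, not the real rover’s dimensions:

```python
import math

TRACK = 0.20      # distance between left and right wheels, metres (placeholder)
WHEELBASE = 0.24  # distance between front and back axles, metres (placeholder)

def spot_turn_angle():
    """Angle (degrees) each corner wheel turns toward the rover's centre
    so the rover can rotate on the spot."""
    return math.degrees(math.atan2(WHEELBASE / 2, TRACK / 2))

def circle_turn_angles(radius):
    """(inner, outer) corner-wheel angles (degrees) for driving a circle of
    the given radius, measured from the turning centre to the rover's midline.
    The inner wheels must turn more sharply than the outer ones."""
    inner = math.degrees(math.atan2(WHEELBASE / 2, radius - TRACK / 2))
    outer = math.degrees(math.atan2(WHEELBASE / 2, radius + TRACK / 2))
    return inner, outer
```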

The fifth servo motor is connected to the head of the Mars Rover so that the rover can rotate its head to detect obstacles.
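
A typical obstacle scan sweeps the head servo across a range of angles and records a distance reading at each one. A plain-Python sketch of that pattern (the angle range, step, and readings are illustrative; the real servo limits and sensor API come from PictoBlox):

```python
def scan_angles(start=-60, stop=60, step=30):
    """Head-servo angles for one left-to-right scan (degrees, 0 = straight ahead)."""
    return list(range(start, stop + 1, step))

def nearest_obstacle(readings):
    """Given {angle: distance_cm} readings, return (angle, distance) of the
    closest obstacle so the rover can steer away from it."""
    angle = min(readings, key=readings.get)
    return angle, readings[angle]

# Fabricated sample readings for illustration:
angle, distance = nearest_obstacle({-30: 40, 0: 12, 30: 55})
```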

Connecting Mars Rover with PictoBlox

Let’s begin by first connecting Quarky to PictoBlox. Select your preferred type of device, i.e. either a desktop/laptop or a smartphone, and follow the instructions.

Desktop

Follow the steps below for connecting Quarky to PictoBlox:

  1. First, connect Quarky to your laptop using a USB cable.
  2. Next, open PictoBlox on your desktop.
  3. After that, select Block or Python Coding as your coding environment.
  4. Then, click the Board button in the toolbar and select board as Quarky.
  5. Next, select the appropriate Serial Port if Quarky is connected via USB, or the Bluetooth Port if you want to connect Quarky via Bluetooth, and press Connect.

And voila! Quarky is now connected to PictoBlox.

Mobile

Follow the steps below for connecting Quarky to PictoBlox:

  1. First, power ON Quarky.
  2. Open PictoBlox on your smartphone. Go to My Space and make a new project by clicking the ‘+(plus)’ button in the bottom-right corner.
  3. Then, tap the Board button in the top-right corner of the toolbar and select board as Quarky.
  4. Next, tap the Connect button and select your device from the list.

And voila! Quarky is now connected to PictoBlox.

Mars Rover Extension in Block Coding

Click on the Add Extension button and add the Mars Rover extension.

Once loaded, you will find the new Mars Rover blocks in the block palette.


Mars Rover Extension in Python Coding

Click on the Add Modules/Libraries button and add the Mars Rover extension.

To access the library functions, you have to add the object declarations:

quarky = Quarky()
rover = MarsRover(4, 1, 7, 2, 6)
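
The five MarsRover() arguments select the five servo channels. One plausible reading, shown here as a hypothetical stand-in class, maps them to the four corner-wheel servos plus the head servo; this mapping is an assumption for illustration, and the authoritative order is defined by the PictoBlox library itself.

```python
class MarsRoverPins:
    """Hypothetical stand-in showing one reading of MarsRover(4, 1, 7, 2, 6):
    five servo channel numbers, one per steerable assembly plus the head.
    The real argument order is defined by PictoBlox, not by this sketch."""
    def __init__(self, front_left, front_right, back_left, back_right, head):
        self.servos = {
            "front_left": front_left,
            "front_right": front_right,
            "back_left": back_left,
            "back_right": back_right,
            "head": head,
        }

rover_pins = MarsRoverPins(4, 1, 7, 2, 6)
```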

PictoBlox Blocks

The block performs the selected action for the quadruped. The action runs for the specified times and at the specified speed.
The block reports the current time.
The block performs the selected motion for the humanoid. The motion runs for the specified times and at the specified speed.
The block stops all the motors of the robot.
The block makes a request to ChatGPT to define the text specified in it. The response from ChatGPT is then stored in PictoBlox and can be accessed using the get AI response block.
This block sets the output of a selected PWM pin of an Arduino Uno, Arduino Mega, or Arduino Nano board to a value from 0 to 255. When set to 128, the output will be high for half the time, and low for the other half. This allows users to control the voltage output to an attached device.
This block writes a specific text, such as “Hello, World!”, onto an LCD display. It is useful for creating simple text-based user interfaces for electronic projects or devices.
This block resets all the servo motors of the robotic arm to their default angle which is commonly referred to as the ‘home’ position.
This block allows you to adjust the robot’s turning speed.
Starts the script whenever a message of a specific color is received.
Moves the sprite a specified number of grid squares down.
Increases the sprite’s size.
Stops all the sprites’ scripts.
This block enables setting the instrument for the upcoming musical note.
This clears all pen and stamp marks from the stage.
After the connection is established, rotates Quarky a specified number of steps to the right.
Shows a specified static emotion on the Quarky LED display.
Detects and identifies the facial expression within a view captured by the camera.
Detects and identifies signs made by hand within a view captured by camera.
After the connection is established, moves the Wizbot a specified number of steps back.
After the connection is established, rotates the Wizbot a specified angle (in degrees) to the left.
This block is used to write text on evive’s TFT display.
evive has two potentiometers whose analog outputs can be varied by turning the knob clockwise or anti-clockwise. This block returns the analog output of either of the potentiometers (from 0 to 1023).
There are 10 digital buttons in the gamepad module, whose data is sent to the device when they are pressed or released. The block reports whether the chosen button is currently pressed on the gamepad: if it is pressed, it returns true; else it returns false.
The block sets the relay connected to the specified digital pin to ON or OFF.
This block is used to set the angles at which the gripper of the robotic arm opens and closes. You need to use this block every time you open or close the gripper, as it defines the angles at which the gripper claw is open and at which it is closed.
This block should be included every time you work with the humanoid robot, as it calibrates the angles of all four servo motors of the arms (2 shoulder servos + 2 hand servos) and saves them in the memory of evive.
The block points its sprite towards the mouse-pointer or another sprite depending on its costume center; this changes the sprite’s direction and rotates the sprite.
The block changes its sprite’s costume to a specified one.

Block Coding Examples


Python Functions

The function enables the automatic display of the landmark on pose/hand detected on the stage.
Syntax: enablebox()
The function disables the automatic display of the landmark on pose/hand detected on the stage.
Syntax: disablebox()
This function is used to analyze the image received as input from the stage, for human pose detection.
Syntax: analysestage()
This function returns the x position of the pose landmark detected. The position is mapped with the stage coordinates.
Syntax: x(landmark_number = 1, pose_number = 1)
This function returns the y position of the pose landmark detected. The position is mapped with the stage coordinates.
Syntax: y(landmark_number = 1, pose_number = 1)
The function tells whether the human pose is detected or not.
Syntax: isdetected(landmark_number = 1, pose_number = 1)
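
“Mapped with the stage coordinates” means the landmark position is converted from the camera frame to the Scratch-style stage PictoBlox uses: 480×360 units with the origin at the centre. A plain-Python sketch of that conversion (the normalised, top-left-origin input convention is an assumption, not confirmed by the PictoBlox documentation):

```python
def to_stage(nx, ny):
    """Convert a normalised landmark position (0-1, origin at the image's
    top-left, y pointing down) to stage coordinates (origin at the centre,
    x in -240..240, y in -180..180, y pointing up)."""
    x = nx * 480 - 240
    y = 180 - ny * 360
    return x, y
```
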
This function is used to analyze the image received as input from the camera, for human hand detection.
Syntax: analysehand()
The function tells whether the human hand is detected or not.
Syntax: ishanddetected()
This function returns the specified parameter of the hand landmark detected.
Syntax: gethandposition(parameter = 1, landmark_number = 4)
This function returns the x position of the hand detected. The position is mapped with the stage coordinates.
Syntax: handx()
This function returns the y position of the hand detected. The position is mapped with the stage coordinates.
Syntax: handy()
The function adds the specified text data to the specified class.
Syntax: pushdata(text_data = “your text”, class_label = “class”)
The function trains the NLP model with the data added with pushdata() function.
Syntax: train()
The function resets and clears the NLP model.
Syntax: reset()
The function analyses the specified text and provides the class name under which it has been classified by the NLP model.
Syntax: analyse(text = “your text”)
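
pushdata(), train(), and analyse() form a label-then-classify workflow: add labelled examples, train, then classify new text. The toy stand-in below mimics that workflow shape with simple word-overlap scoring; it is not the model PictoBlox actually trains.

```python
from collections import Counter, defaultdict

class ToyTextClassifier:
    """Hypothetical stand-in for the pushdata/analyse workflow: scores a text
    by how many of its words appeared in each class's training data."""
    def __init__(self):
        self.data = defaultdict(Counter)

    def pushdata(self, text_data, class_label):
        # Add labelled training text to the named class.
        self.data[class_label].update(text_data.lower().split())

    def analyse(self, text):
        # Return the class whose training words best overlap the input.
        words = text.lower().split()
        return max(self.data, key=lambda c: sum(self.data[c][w] for w in words))

clf = ToyTextClassifier()
clf.pushdata("turn left now", "left")
clf.pushdata("turn right now", "right")
```
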
The function is used to control the state of the camera.
Syntax: video(video_state = “on”, transparency = 1)
The function enables the automatic display of the box on object detection on the stage.
Syntax: enablebox()
The function disables the automatic display of the box on object detection on the stage.
Syntax: disablebox()
This function is used to set the threshold for the confidence (accuracy) of object detection, 0 being low confidence and 1 being high confidence.
Syntax: setthreshold(threshold = 0.5)
This function is used to analyze the image received as input from the camera, for objects.
Syntax: analysecamera()
This function is used to analyze the image received as input from the stage, for objects.
Syntax: analysestage()
This function returns the total number of objects detected in the camera feed or the stage.
Syntax: count()
This function is used to get the class name of the analyzed object.
Syntax: classname(object = 1)
This function returns the x position of the object detected. You can specify the object for which the value is needed. The position is mapped with the stage coordinates.
Syntax: x(object = 1)
This function returns the y position of the object detected. You can specify the object for which the value is needed. The position is mapped with the stage coordinates.
Syntax: y(object = 1)
This function returns the width of the object detected. You can specify the object for which the value is needed. The position is mapped with the stage coordinates.
Syntax: width(object = 1)
This function returns the height of the object detected. You can specify the object for which the value is needed. The position is mapped with the stage coordinates.
Syntax: height(object = 1)
This function is used to get the confidence (accuracy) of object detection, 0 being low confidence and 1 being high confidence.
Syntax: confidence(object = 1)
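
setthreshold() and confidence() work together: detections whose confidence falls below the threshold are ignored. A plain-Python sketch of that filtering step, using fabricated sample detections rather than real PictoBlox output:

```python
def filter_detections(detections, threshold=0.5):
    """Keep only detections whose confidence meets the threshold.
    Each detection is a dict like {"class": name, "confidence": 0-1}."""
    return [d for d in detections if d["confidence"] >= threshold]

# Fabricated sample data for illustration:
sample = [
    {"class": "cup", "confidence": 0.91},
    {"class": "phone", "confidence": 0.32},
]
kept = filter_detections(sample, threshold=0.5)
```
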
The function returns whether the specified signal is detected in the analysis or not.
Syntax: issignaldetected(signal_name = “Go”)
The function returns the specified parameter for the specified signal detected.
Syntax: getsignaldetail(signal_name = “Go”, parameter_value = 1)

Python Coding Examples
