
Week 14. Interface and Application Programming



Image Courtesy: Photo by BENCE BOROS on Unsplash

This week seemed a little scary at first, since the topic of Interface and Application Programming is broad and encompasses many aspects of software development.

Here we explore applications that can interface with devices. It can be a mobile application or simply an application that runs on a computer and connects with a board we made. I decided to try to make an app that connects to one of my boards.

Assignment Tasks:

  1. Group Assignment: Compare as many tool options as possible.

  2. Individual Assignment: Write an application that interfaces a user with input and/or output device(s) on a board that you made.

Learning Process

Interfaces

In software development, an interface refers to a set of rules or protocols that define how different software components or systems can communicate with each other. Interfaces act as a contract between different entities, specifying the methods, properties, and behaviors that can be accessed or implemented.

Interfaces serve as a way to achieve abstraction, allowing developers to separate the implementation details from the external functionality of a system or component. By defining interfaces, developers can create modular and reusable code, as different components can interact with each other without relying on the underlying implementation.

Application Programming

Application programming involves the development of software applications, which are programs designed to perform specific tasks or functions for end-users. Application programming focuses on writing code that defines the behavior, functionality, and user experience of an application.

When I first heard about this week, I was a little concerned. I had never attended a session on this subject before and was initially confused. Then, with the help of our instructors, I began to learn the topic and realised that I had already used one of these applications during my Input Devices week; only then did this week's topic really click for me. As my excitement grew, I discovered the Three.js library, and I'm thrilled about the opportunities it provides. There is also always the potential for mobile apps, which we can try out on simple platforms such as MIT App Inventor.

I therefore made the decision to give Three.js and MIT App Inventor a go this week.

Individual Assignment

Processing

Processing is an open-source programming language and development environment specifically designed for visual arts, creative coding, and multimedia projects. It is built on top of Java and provides a simplified syntax and an intuitive interface, making it accessible for artists, designers, and beginners to create interactive applications and visual experiences.

Processing provides a simple and beginner-friendly environment to learn programming concepts while creating visually engaging applications. It has an active community, extensive documentation, and a vast collection of user-contributed libraries, examples, and tutorials.

101

Link to Processing Application on Input Device Week.

Three.js

Three.js is a JavaScript library for creating 3D graphics and interactive web-based applications. It is built on top of WebGL, a web standard for rendering 3D graphics in a browser, and provides a simplified and high-level API for working with 3D objects, materials, lighting, and animation.

Some key features of Three.js:

1. 3D Visualization

2. Interactive Web Applications

3. Virtual Reality (VR) and Augmented Reality (AR)

4. Physics Simulation

5. Particle Systems

6. Data Visualization

7. Game Development

Of these, I was most fascinated by 3D visualization and VR & AR.

I tried a very basic example with the help of the examples available in the Three.js documentation.

A small video of the cube animation running in the browser:

The code I used is given below:

HTML Code:

<!DOCTYPE html>
<html lang="en">
    <head>
        <meta charset="utf-8">
        <title>My first three.js app</title>
        <style>
            body { margin: 0; }
        </style>
        <script async src="https://unpkg.com/es-module-shims@1.6.3/dist/es-module-shims.js"></script>

        <script type="importmap">
          {
            "imports": {
              "three": "https://unpkg.com/three@v0.152.2/build/three.module.js",
              "three/addons/": "https://unpkg.com/three@v0.152.2/examples/jsm/"
            }
          }
        </script>
    </head>
    <body>
        <script type="module" src="/main.js"></script>
    </body>
</html>

JavaScript Code:

import * as THREE from 'three';

// The scene holds all objects; the camera defines our view into it
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera( 75, window.innerWidth / window.innerHeight, .1, 1000 );

// The renderer draws the scene into a full-window canvas
const renderer = new THREE.WebGLRenderer();
renderer.setSize( window.innerWidth, window.innerHeight );
document.body.appendChild( renderer.domElement );

// A green 1x1x1 cube
const geometry = new THREE.BoxGeometry( 1, 1, 1 );
const material = new THREE.MeshBasicMaterial( { color: 0x00ff00 } );
const cube = new THREE.Mesh( geometry, material );
scene.add( cube );

// Move the camera back so the cube is in view
camera.position.z = 5;

// Animation loop: rotate the cube a little on each frame
function animate() {

    requestAnimationFrame( animate );

    cube.rotation.x += 0.01;
    cube.rotation.y += 0.01;

    renderer.render( scene, camera );

}

animate();

MIT App Inventor

After all of these appetisers, the real fun begins. I'm going to use MIT App Inventor to create an app that will let me control my Neopixel LED through the board I built with a SAMD11C chip during my Output Devices week.

MIT App Inventor is a visual development environment that allows individuals, including those without prior programming experience, to create mobile applications for Android devices. It provides a drag-and-drop interface, where users can visually design the user interface and program the behavior of their app using blocks-based programming.

To get started with MIT App Inventor, you can access it online through the official MIT App Inventor website and create an account. The website provides extensive documentation, tutorials, and sample projects to guide users through the app development process.

The only drawback I found is that it doesn't support iOS.

First, go to the MIT App Inventor website and create a new account:

102

Once the account is created, go to Projects - Start new project.

103

The new project is now ready for our trial. The window consists of six parts:

104

1. Palette: The collection of components and blocks used to design and program your mobile application. The Palette is located on the left side of the MIT App Inventor interface and provides a wide range of components that you can drag and drop onto the design canvas to build your app's user interface.

2. Top Bar: Contains several important elements used to manage screens and navigate through your app's interface, such as the Design/Blocks buttons, the screen selector, and the Add Screen button.

3. Components: Shows the components we have added to the app. We can rename or delete them from here.

4. Media: In this section we add or delete images.

5. Properties: Here we can edit the properties of the selected component.

6. Mobile screen: This is our app. Here we can see how the app will look; we can drag and drop components here and arrange them for a nice UI experience.

I created an app to control my Neopixel LED, as mentioned earlier.

105

It contains:

1. A logo on top: I uploaded the Fab Lab logo via the Media section.

2. A label explaining what to do.

3. Three buttons, RED, GREEN & BLUE, to control the LED.

4. Two more buttons to connect/disconnect via serial communication.

5. A checkbox showing whether the mobile is connected via an OTG serial cable.

6. A non-visible section showing the connectivity components; in our case, serial communication.

Once the app is designed, we can program it via the Blocks section.

106

Before getting further into programming, I wanted to connect my device to the app, so for the time being I made the blocks for the serial connection only. Now we can install the app on any Android phone. There are two options for that. One is to build the app, but then every time we make changes we need to uninstall the app from the phone, re-build it in App Inventor, and install it again.

107

The second option is MIT's emulator, the AI Companion app. With this we don't need to install our app; we can use all its functions by just scanning the QR code generated by App Inventor.

108

Since this was my first time developing an app, I just built it and installed it on my Android phone.

Unfortunately, it wasn't functioning on my phone. A serial communication connection error occurred, and I could not determine the cause. I tested it on a different phone, and the same issue appeared there as well. I tried to resolve the problem in a variety of ways, including asking ChatGPT, but without success.

After all my efforts, my tutor told me that this problem might have other variables and require additional time to solve. Therefore, I decided to move forward with a Bluetooth module that is already in our inventory rather than spend more time on this. Maybe I can come back to this issue after completing this week's assignment.

We have an HC-06 Bluetooth module which can be connected to my Output Devices week board.

109

Since my plan changed, I needed to re-design my app. I made a new app to show the readings of the ultrasonic sensor. In it I added one ListPicker, one button, and two labels. The ListPicker is for connecting Bluetooth and the button is for activating the sensor. I also added the Bluetooth connectivity and Clock components, which are non-visible.

110

Now we will use blocks to program each component.

111

First we add the ListPicker blocks to connect the device, then the Clock, which shows us whether the device is connected or not, and last the button to read the values from the board.

Now we can build and install the app, or use the AI Companion emulator to run the app on any Android phone.

But before that we need to program the chip for the ultrasonic sensor and the Bluetooth module. Since this same board was used in the Input Devices week, the code already has the ultrasonic sensor readings; we only need to add the serial communication for the Bluetooth module. With the help of my tutor I made these corrections. The code now used is shown below:

#include <SoftwareSerial.h>

SoftwareSerial mySerial(11, 12); // RX, TX

const int echoPin = 15;
const int trigPin = 14;
long duration, distance;

void setup() {
  Serial.begin(9600);
  mySerial.begin(9600);
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
}

void loop() {

  // Write a pulse to the HC-SR04 Trigger Pin
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);

  // Measure the response from the HC-SR04 Echo Pin
  duration = pulseIn(echoPin, HIGH);

  // Determine distance from duration
  // Use 343 metres per second as speed of sound
  distance = duration * 0.034 / 2;

  // Send the reading over USB serial (for debugging) and over the
  // software serial port, so the Bluetooth module forwards it to the app
  Serial.print(distance);
  Serial.println(" cm");
  mySerial.print(distance);
  mySerial.println(" cm");
  delay(500);
}

This is uploaded to the chip using a programmer, as in the Input Devices week.

The Bluetooth module connected to the board:

112

After that, everything is set up. The app can be simulated on any Android device. To do that, we must first download and install the MIT AI Companion app on our phone. Then, when we choose Connect - AI Companion on the App Inventor screen, a QR code will appear. We launch the MIT AI Companion app on the phone and use it to scan the QR code. When it has loaded, we can see our app on the mobile device, ready to use.

The Bluetooth module must first be paired with the phone; then we can select it from the list of paired modules via the Connect Bluetooth ListPicker. The device is now connected. When the activate sensor button is pressed, the label displays the distance to the object in centimetres.

Showing a small Video how it functions.


And voilà! That's the end of our topics. Now comes the big part: THE PROJECT. Let's do it.

Hero Shot

heroshot

Group Assignment

Detailed Study Report on our Group Assignment Page.



Downloads

Download App here



Help Taken & Other References

ChatGPT was used for clearing doubts and for help with content.

Ultrasonic Module HC-SR04

HC-05 Bluetooth Module Interfacing with Arduino UNO
