
Graphical user interface (GUI) software tools

Embedded GUI programming tools implement easy-to-use instrument controls

Today's tech-savvy customers demand sophisticated yet easy-to-use front-panel user interfaces for embedded control, also referred to as human-machine interfaces (HMI) or man-machine interfaces (MMI). Mosaic provides software tools to help you quickly develop menu-based graphical interfaces that respond to the user's touch.

The Graphics Converter application runs on your PC and converts your custom images into graphics that can be displayed by the controller.

The GUI Builder program runs on the target controller board, allowing you to quickly design and place your buttons and graphics in a menu-based system, and it even generates the GUI source code for you!

The GUI Toolkit is the runtime engine that processes user inputs (touchscreen or keypad presses) and manages the user interface.

Taken together, these tools automate the process of designing a user interface so that you can get to market quickly.

Graphics Converter simplifies custom graphics generation

The GUI Toolkit comes with an assortment of useful graphics that implement buttons of various sizes, arrows, and icons. For most applications, you'll also want to create your own custom graphics.

The Graphics Converter program makes this easy to do on your Windows desktop PC. Simply create the desired graphical image using any graphics editor (such as PC Paint or Photoshop) and save it as a bitmap file.

After you've created all your bitmap images, click on the Graphics Converter icon in the Mosaic IDE, and the images are transformed into a download file that puts the graphics in the flash memory of your controller. The graphics are now ready to use.
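Although the exact files the converter produces depend on your controller and toolchain, the end result is a set of named image handles that your application code can refer to. The fragment below is only a hypothetical sketch of that idea; the names and flash addresses are invented for illustration and are not the converter's actual output:

    /* Hypothetical sketch of converter output: named handles for bitmaps
     * stored in flash. The names and addresses shown are invented.      */
    #define LOGO_BMP           0x00010000UL  /* splash-screen logo        */
    #define START_BUTTON_BMP   0x00011000UL  /* unpressed "Start" button  */
    #define START_PRESSED_BMP  0x00012000UL  /* pressed "Start" button    */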


GUI Builder enables interactive screen design and generates your source code

The GUI Builder lets you interactively place your buttons and images on the touchscreen. The program runs on your touchscreen-based controller, and you communicate with it using the Mosaic Terminal program. You can select any of the buttons or graphics that have been loaded into flash memory, and use your finger or the arrow keys to move them to their desired locations on the screen. In this way you quickly build up the menu screens that the end user will see.

When you're pleased with the user interface design, the GUI Builder outputs source code in your selected programming language (C or Forth). All you have to do is attach an event handler routine to each button so it will perform its desired action at runtime.
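The details of the generated code depend on the language and Toolkit version you use, so the C fragment below is only a schematic sketch: the Button type, its field names, and start_pump are invented placeholders meant to show the general pattern of pairing a builder-placed button with your own handler routine.

    #include <stdio.h>

    /* Schematic sketch only: the Button type and field names are invented
     * placeholders, not the GUI Builder's generated code or the Toolkit API. */
    typedef struct {
        const char *name;               /* button label                          */
        void (*on_press)(void);         /* handler called when button is touched */
    } Button;

    static void start_pump(void)        /* your event handler: the runtime action */
    {
        printf("Pump started\n");
    }

    int main(void)
    {
        Button start_button = { "Start", NULL };   /* placed with the GUI Builder */
        start_button.on_press = start_pump;        /* you attach the handler      */

        start_button.on_press();                   /* simulate a touchscreen press */
        return 0;
    }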


GUI Toolkit defines and manages the user interface

Using the Graphics Converter and the GUI Builder, you create and place buttons and graphics on the multiple screens that make up the user interface. Typically, an instrument will have a main screen, from which other screens are accessed. Each screen is in turn created from building blocks such as graphics images, button objects, and ASCII strings. These building blocks must be organized in an intuitive way so users can easily operate your instrument.

To simplify your programming and design of the user interface, the GUI Toolkit uses object-oriented concepts to organize these building blocks. Object-oriented programming allows you to organize data structures (objects) hierarchically and manipulate the data using predefined methods. With the GUI Toolkit, it is simple to create elementary objects such as graphics that contain bitmapped image data and textboxes that contain strings. You can load those objects into other objects such as screens so that they are shown on the display. You can create controls that acquire data from a user or actuate hardware when a user touches the touchscreen. A button is a simple control.
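As a rough illustration of that hierarchy, the sketch below models a screen built from a graphic, a textbox, and a button. The struct and field names are invented for illustration and are not the GUI Toolkit's actual types or methods.

    #include <stdio.h>

    /* Illustrative sketch of the object hierarchy described above; these
     * struct names are invented, not the GUI Toolkit's actual types.     */
    typedef struct { unsigned long flash_addr; } Graphic;   /* bitmapped image  */
    typedef struct { const char *text;         } Textbox;   /* ASCII string     */
    typedef struct {                                         /* a simple control */
        Graphic image;
        void (*on_press)(void);
    } Button;
    typedef struct {                                         /* one menu screen  */
        Graphic background;
        Textbox title;
        Button  start;
    } Screen;

    static void start_pressed(void) { printf("Start pressed\n"); }

    int main(void)
    {
        /* Build the main screen from elementary objects, then "show" it. */
        Screen main_screen = {
            .background = { 0x00010000UL },
            .title      = { "Main Menu" },
            .start      = { { 0x00011000UL }, start_pressed },
        };
        printf("Showing screen: %s\n", main_screen.title.text);
        main_screen.start.on_press();   /* simulate touching the Start button */
        return 0;
    }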

The GUI Toolkit allows you to create onscreen objects, control their properties, define and assign actions for them, and specify how they respond to events like a button press. You can easily create interactive buttons, graphics, and textboxes on multiple screens to implement a sophisticated yet intuitive graphical user interface.

The runtime engine of the GUI Toolkit scans the touchscreen for button presses and activates the handler function associated with the touched button. In addition to specifying the called function, you can easily control other actions. For example, you can set a flag that sounds the audible beeper whenever the button is pressed, or you can provide visual feedback for the button press using a pressed-button graphic. These functions are handled automatically by the GUI Toolkit so you can create a polished interface without delving into the low-level details.
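The sketch below illustrates that runtime behavior conceptually. The property and function names (beep_on_press, pressed_image, on_touch) are invented stand-ins for the kinds of per-button options described above, not the GUI Toolkit's actual API.

    #include <stdio.h>
    #include <stdbool.h>

    /* Sketch of per-button feedback options; the field names here are
     * placeholders, not the GUI Toolkit's actual property names.       */
    typedef struct {
        bool          beep_on_press;    /* sound the audible beeper          */
        unsigned long image;            /* normal button graphic (in flash)  */
        unsigned long pressed_image;    /* graphic shown while pressed       */
        void (*handler)(void);          /* application action                */
    } Button;

    static void beep(void)            { printf("BEEP\n"); }
    static void draw(unsigned long g) { printf("draw graphic at 0x%08lX\n", g); }
    static void next_screen(void)     { printf("advance to the next screen\n"); }

    /* What a runtime engine does, conceptually, when a press is detected. */
    static void on_touch(Button *b)
    {
        if (b->beep_on_press) beep();   /* audible feedback    */
        draw(b->pressed_image);         /* visual feedback     */
        b->handler();                   /* application action  */
        draw(b->image);                 /* restore normal look */
    }

    int main(void)
    {
        Button next = { true, 0x00011000UL, 0x00012000UL, next_screen };
        on_touch(&next);                /* simulate a touchscreen press */
        return 0;
    }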

