A Safety Reporting Prototype
We just posted a prototype safety reporting project summary on our QuickStartRM site. Using PiLR, we built a first-generation prototype of a mobile application that can help a company document potential safety issues in the workplace.
The prototype uses pictures to capture primary documentation about the situation. In addition, it lets the reporter annotate the picture using the phone's speech-to-text capabilities, an easy way to highlight important details. Finally, the application supports structured questions, giving the reporter a way to convey severity or urgency.
The camera and microphone are not the only sensors the application can use. In our prototype we also capture GPS coordinates and a timestamp, so you know where and when each report was made. This shortens the reporter's effort and lets you verify that the report was actually made from one of your locations.
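A report like the one described above can be sketched as a simple payload that bundles the photo, the speech-to-text annotation, the question answers, and the sensor metadata. The field names and `build_report` helper here are illustrative assumptions, not the actual PiLR schema:

```python
from datetime import datetime, timezone

def build_report(photo_ref, transcript, answers, lat, lon):
    """Assemble a hypothetical safety-report payload; the field names
    are illustrative, not the actual PiLR schema."""
    return {
        "photo": photo_ref,               # reference to the captured image
        "annotation": transcript,         # speech-to-text annotation
        "answers": answers,               # structured question responses
        "gps": {"lat": lat, "lon": lon},  # where the report was made
        "timestamp": datetime.now(timezone.utc).isoformat(),  # when
    }

report = build_report("img_001.jpg", "Spill near the loading dock",
                      {"severity": "high"}, 42.36, -71.06)
```

Capturing the GPS fix and timestamp automatically, rather than asking the reporter to type them, is what shortens the reporting effort.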
This information is transferred from the phone to the PiLR MBaaS in real time. Using the data captured from the phone, we created a simple set of reports with R and the R-API provided by PiLR, which makes real-time reporting possible. The reporting application provides three views:
- A map view that can function as a dashboard
- A summary report view that lists all reported issues and supports sorting and filtering
- A detailed report view that combines the image and other captured data, which an investigator might use to review the report
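The sorting and filtering in the summary view can be sketched with a few lines over plain report records. The records and field names below are hypothetical, not the actual PiLR data model:

```python
# Hypothetical report records; locations and severity values are illustrative.
reports = [
    {"id": 1, "location": "Plant A", "severity": 3},
    {"id": 2, "location": "Plant B", "severity": 1},
    {"id": 3, "location": "Plant A", "severity": 2},
]

def summary(reports, location=None, min_severity=0):
    """Filter reports by location and minimum severity,
    returning the most severe issues first."""
    rows = [r for r in reports
            if (location is None or r["location"] == location)
            and r["severity"] >= min_severity]
    return sorted(rows, key=lambda r: r["severity"], reverse=True)

plant_a = summary(reports, location="Plant A")
```

The same filtered view could be produced in R from the data frames returned by the R-API; this Python version only illustrates the shape of the operation.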
The map view doubles as a dashboard. The blue pins represent your locations; clicking a pin shows the location name (and other information if you want). A green circle with a number near a pin indicates how many issues have been reported from that location, while a pin with no circle is a location with no reports. Finally, a blue circle represents a report that was submitted but does not map to one of your locations. This becomes a very quick way to see how your locations are doing.
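Matching each report's GPS fix to a known location, and flagging the ones that match nothing, can be sketched with a great-circle distance check. The radius, location list, and function names are assumptions for illustration, not how PiLR actually does the matching:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def pin_counts(locations, reports, radius_km=0.5):
    """Count reports within radius_km of each known location (the green
    circles); reports matching no location are 'unmatched' (the blue circles)."""
    counts = {name: 0 for name, _, _ in locations}
    unmatched = 0
    for rlat, rlon in reports:
        match = next((name for name, lat, lon in locations
                      if haversine_km(lat, lon, rlat, rlon) <= radius_km), None)
        if match is not None:
            counts[match] += 1
        else:
            unmatched += 1
    return counts, unmatched

locations = [("Headquarters", 42.36, -71.06)]
reports = [(42.3601, -71.0601), (40.0, -75.0)]
counts, unmatched = pin_counts(locations, reports)
```

Here the first report falls within half a kilometer of the known site and increments its pin count, while the second would render as an unmatched blue circle.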
In the interest of brevity, we will jump to the detailed issue page. This view combines all of the information submitted by the reporter. Pictures are a powerful documentation tool, and combined with the text annotations and question answers they make it easy for investigators to understand what happened. Because these materials were captured in real time, they are not subject to recall bias. Finally, they are useful to have if it becomes necessary to go back to the reporter with follow-up questions.
PiLR adds value to the process by making the development and deployment of the mobile application fast and easy. The total effort to create this prototype was less than 25 hours; of that, about 80% was spent on data analysis and visualizations. All the content and rules that define the mobile app were completed in about 4 hours, yielding an app deployed in the iTunes and Google Play stores. All of the data and metadata are generated automatically, and all of the streams share common notions of identity and time. The common infrastructure also makes it easier to combine data from different sources. For example, it is easy to couple sensor data about activity with user observations about how they are feeling, since both streams share a common notion of time. And of course, the system provides the type of security you would expect for these kinds of data.
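The benefit of a shared time base can be sketched as a nearest-in-time join between two streams. The stream contents, field names, and tolerance below are hypothetical, illustrating the idea rather than the PiLR implementation:

```python
from bisect import bisect_left

def nearest_in_time(stream, t, tolerance=60):
    """Find the record in a time-sorted stream closest to timestamp t,
    within `tolerance` seconds. Works because both streams share a
    common time base (epoch seconds here)."""
    times = [s["t"] for s in stream]
    i = bisect_left(times, t)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(stream)]
    best = min(candidates, key=lambda j: abs(times[j] - t), default=None)
    if best is not None and abs(times[best] - t) <= tolerance:
        return stream[best]
    return None

# Hypothetical streams: activity sensor readings and a user observation.
activity = [{"t": 100, "steps": 40}, {"t": 160, "steps": 12}]
feeling = {"t": 158, "mood": "tired"}
match = nearest_in_time(activity, feeling["t"])
```

Because both streams carry comparable timestamps, the "tired" observation pairs directly with the activity reading taken two seconds later, with no per-source alignment work.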
Interested in learning more about our prototype? Visit the QuickStartRM site.