Our hawker centre dilemma: “Must return trays meh?”

Team Hardcode (SMU-X)
11 min read · Nov 26, 2020


We’ve always been taught in public schools to clear our receptacles after eating — but once we’re out, do we, honestly? After all, there are cleaners looking after our hawker centres. If no one’s clearing up, why should we spoil the market?

A place for laughter, catch-ups and tummy-filling. Credits: Straits Times (ST)

There’s been some buzz in town recently over Singapore’s bid to inscribe our hawker culture on UNESCO’s Representative List of the Intangible Cultural Heritage of Humanity.

Our hawker centres are uniquely Singaporean: the world’s cheapest Michelin-starred meal, the whiff of wok hei, the hubbub of the lunch crowd. To uphold Singapore’s squeaky-clean standards, public hygiene at these centres is a priority.

That’s why, with the help of the Ministry of Sustainability and the Environment (MSE), the National Environment Agency (NEA), and the CS462: Internet of Things (IoT) faculty at SMU-X, we were tasked with building an IoT proof-of-concept to collect insights that can improve the effectiveness of future tray self-return campaigns.

Problem Statement

We are overly reliant on efficient cleaning services… One in 3 respondents believed that it is the cleaners’ responsibility to return trays. — Ms Grace Fu, Minister for Sustainability and the Environment

One in three. That means we’re not the only ones, right?

But to spark a change, we have to ask ourselves: why?

And how do we find out?

Sample of a poster seen at hawker centres in 2020. Credits: NEA

Over the years, NEA has launched several campaigns to encourage tray self-returns (including cutlery, receptacles and trays). Although these campaigns have been sufficient, there is still a need to digitalise and improve the data-collection process.

In other words, to empower future campaigns with a data-driven approach through technology.

Current Practices

An SG Clean Ambassador in action. Credits: NEA

SG Clean Ambassadors are deployed to hawker centres to encourage patrons to return their trays. Surveyors, too, head down on-site to manually count tray-return rates.

That’s 114 hawker centres nationwide.

In the digital age, could these staff be redeployed to perform other duties while we let technology handle the menial tasks?

Stakeholders

We’ve identified two groups of stakeholders — MSE/NEA & estate managers.

Firstly, IoT provides more robust, consolidated real-time data for analytics. The descriptive data generated offers insights into patrons’ behavioural patterns, enabling MSE and NEA to design more effective campaigns going forward.

A messy affair: cleaners have to clear up after patrons. Credits: ST

Next, the estate management can utilise real-time data from our sensors to strategise the deployment of cleaners across the hawker centre for more efficient cleaning.

IoT Solution Concept & Implementation

Our IoT proof-of-concept was deployed at Beo Crescent Market Food Centre and consists of two methods that provide real-time tray-return behaviour data:

  1. Method A — Detecting Tray In/Out. We incorporated RFID, motion sensors and a force-sensitive resistor (FSR) to detect trays leaving a stall (tray-out), as well as trays placed into the cleaner’s trolley (tray-in).
  2. Method B — Tablevision. We used a camera and a custom machine-learning model in the cloud to automate object detection.

Data is then processed on the cloud and the insights are visualised on a dashboard for our stakeholders.

Dashboard layout for Tray In/Out

Method A: Detecting Tray In/Out

The main design consideration is to make our solution portable and obstruction-free for the hawkers’ operations.

We can monitor the number of trays returned to the cleaners’ trolley, each of which indicates a non-self-return. To illustrate, let’s look at the equation below:

Trays leaving stall = self-returns + cleaner returns

Using the above equation, we can label self-returns as positive tray-return rates, and cleaner returns as negative tray-return rates.
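To make this concrete, here’s a minimal Python sketch of how the two rates could be derived from the counts we collect (the function and variable names are illustrative, not from our actual codebase):

```python
def tray_return_rates(tray_out_count: int, cleaner_return_count: int) -> dict:
    """Derive positive (self-return) and negative (cleaner-return) rates.

    tray_out_count:       trays detected leaving the stall
    cleaner_return_count: trays detected entering the cleaner's trolley
    """
    # Self-returns are the trays that left the stall but never
    # passed through the cleaner's trolley.
    self_returns = tray_out_count - cleaner_return_count
    return {
        "positive_rate": self_returns / tray_out_count,
        "negative_rate": cleaner_return_count / tray_out_count,
    }

# e.g. 120 trays left the stall; 45 ended up in the trolley
print(tray_return_rates(120, 45))  # positive 0.625, negative 0.375
```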

Tray-Out

Our glorious GIF of the Tray Out solution

To detect tray-out from the hawker stalls, we used the FSR and RFID Reader together with our Raspberry Pi (“Pi”). We placed our sensors on an acrylic board, which detects when a tray is placed and removed.

This is the detection sequence when a tray is placed atop the board:

  • Tray placed on the board: the FSR value changes from 0 to 1.
  • Food placed on the tray; the weight pushes the tray down: the RFID value changes from 0 to 1.
  • Once the tray is removed from the board, a tray-out count is added to the database.
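A minimal sketch of that polling loop, assuming both sensors are exposed as simple digital reads (the GPIO pin numbers and the database helper below are illustrative stand-ins, not our production code):

```python
import time
import RPi.GPIO as GPIO

FSR_PIN = 17   # illustrative pin assignments
RFID_PIN = 27

GPIO.setmode(GPIO.BCM)
GPIO.setup(FSR_PIN, GPIO.IN)
GPIO.setup(RFID_PIN, GPIO.IN)

def record_tray_out():
    """Placeholder: write one tray-out event to the database."""
    print("tray-out recorded")

tray_armed = False  # True once a loaded tray is sitting on the board
try:
    while True:
        fsr, rfid = GPIO.input(FSR_PIN), GPIO.input(RFID_PIN)
        if fsr and rfid:
            tray_armed = True    # tray and food detected on the board
        elif tray_armed and not fsr:
            record_tray_out()    # tray lifted off the board: count a tray-out
            tray_armed = False
        time.sleep(0.1)          # poll at 10 Hz
finally:
    GPIO.cleanup()
```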

Our aim is to provide an uncomplicated, obstruction-free setup for the hawker while collecting data. We used velcro to secure our Pi and sensor board so the hawker can easily set up and dismantle it daily.

We made our solution water and heat resistant to withstand hot soup spillage from the hawker stall.

It can handle more than just a splash!

Tray-In

Tray In setup and data collection process flow

To detect the number of trays cleared by the cleaners into the trolley, we used the FSR and Motion Sensor together with our Raspberry Pi.

The FSR detects when there is a tray placed on the board, allowing us to track the initial tray-in count. For subsequent counts, the motion sensor is used to detect trays placed into the trolley.

The FSR also helps track when the cleaner returns the stack of collected trays from the trolley to the stall.
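A hedged sketch of the trolley-side counting logic under the same assumptions (illustrative pins; a real FSR would sit behind an ADC):

```python
import time
import RPi.GPIO as GPIO

FSR_PIN, PIR_PIN = 17, 22  # illustrative pins for the FSR and motion sensor

GPIO.setmode(GPIO.BCM)
GPIO.setup(FSR_PIN, GPIO.IN)
GPIO.setup(PIR_PIN, GPIO.IN)

tray_in_count = 0
stack_present = False

while True:
    if GPIO.input(FSR_PIN) and not stack_present:
        # First tray lands on the trolley board: the initial tray-in.
        stack_present = True
        tray_in_count += 1
        time.sleep(1.0)  # debounce the initial drop
    elif stack_present and GPIO.input(PIR_PIN):
        # Motion above the stack: another tray dropped in.
        tray_in_count += 1
        time.sleep(1.0)  # debounce so one drop isn't double-counted
    elif stack_present and not GPIO.input(FSR_PIN):
        # Cleaner lifted the whole stack back to the stall: reset.
        stack_present = False
    time.sleep(0.1)
```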

P.S. view our Tray In setup on YouTube.

Data Processing & Visualisation

An overview architecture for Detecting Tray In/Out
Our data on MongoDB
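For illustration, pushing one such event into MongoDB could be a single pymongo call; the connection string, database and field names here are assumptions rather than our actual schema:

```python
from datetime import datetime, timezone
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # placeholder URI
events = client["hawker"]["tray_events"]

events.insert_one({
    "sensor_id": "trolley-01",  # which Pi/board produced the event
    "event": "tray_in",         # "tray_in" or "tray_out"
    "timestamp": datetime.now(timezone.utc),
})
```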

Data Accuracy

Ground Truth vs Sensor Data Accuracy

To calculate percentage error, we used the standard percent error formula: |sensor count − ground truth| ÷ ground truth × 100%.
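In code, the check is a one-line helper:

```python
def percent_error(sensor_count: float, ground_truth: float) -> float:
    """Percent error of the sensor count against the surveyed ground truth."""
    return abs(sensor_count - ground_truth) / ground_truth * 100

print(percent_error(45, 50))  # 10.0
```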

Data comparison between sensor data and MSE/NEA data
A statistical test to find the accuracy of our sample size data

To learn more about Method A, click here.

Method B: Tablevision

We used Google Cloud AutoML for machine learning (ML) and object detection APIs to rapidly prototype our proof-of-concept. The object-detection results are then processed to generate self-return or cleaner-return data.
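As a hedged illustration of that post-processing step, here’s how detections could be tallied per frame; the (label, score) pairs are a simplified stand-in for the AutoML API’s actual response format:

```python
Detection = tuple[str, float]  # (label, confidence score), simplified

def summarise_frame(detections: list[Detection], threshold: float = 0.5) -> dict:
    """Count the people, trays and crockery seen in one camera frame."""
    counts = {"Person": 0, "Trays": 0, "Crockeries": 0}
    for label, score in detections:
        if score >= threshold and label in counts:
            counts[label] += 1
    return counts

frame = [("Person", 0.91), ("Trays", 0.84), ("Crockeries", 0.63), ("Trays", 0.41)]
print(summarise_frame(frame))  # {'Person': 1, 'Trays': 1, 'Crockeries': 1}
```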

We trained the custom machine-learning model on a series of images captured from our camera module. This allows us to tag images easily and detect objects accurately based on our classification labels:

  • a “Person”
  • “Crockeries” on the table, and
  • “Trays” on the table

A quick animation of how we trained our ML model:

Look, Ma! I’m a Mechanical Turk.

Session States

We used table states to detect each session and its activities. Behaviour is differentiated by the sequence of states (a sketch of this classification follows the list):

  • Positive tray-returns: Table state 0 -> 2 -> 0
  • Negative tray-returns: Table state 0 -> 2 -> 1 -> 0
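Here is that sketch, with our reading of the states spelled out as assumptions: 0 = empty table, 2 = table occupied with trays, 1 = trays left behind after the patron has gone.

```python
def classify_session(states: list[int]) -> str:
    """Classify one table session from its observed state sequence.

    Assumed meanings: 0 = empty, 2 = occupied with trays,
    1 = trays left behind after the patron leaves.
    """
    if states == [0, 2, 0]:
        return "positive"  # patron self-returned the tray
    if states == [0, 2, 1, 0]:
        return "negative"  # a cleaner had to clear the tray
    return "unknown"       # noisy or unrecognised sequence

print(classify_session([0, 2, 0]))     # positive
print(classify_session([0, 2, 1, 0]))  # negative
```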

Data Processing & Visualisation

A simple overview of Tablevision’s architecture

The Pi is used as the gateway node to send image data to our Processor, which derives the table states and writes them to our MongoDB.

An example of a session below:

Notice the states value indicating a positive tray-return.
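For readers without the screenshot, a session record might look roughly like this (an illustrative mock-up, not our actual document):

```python
session = {
    "table_id": 12,                       # hypothetical table identifier
    "start": "2020-11-02T12:05:31+08:00",
    "end": "2020-11-02T12:24:10+08:00",
    "states": [0, 2, 0],                  # 0 -> 2 -> 0: a positive tray-return
}
```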

Data Accuracy

Ground Truth vs Sensor Data Accuracy
Statistical test to find the accuracy of our sample size data

To learn more about Method B, click here.

Sensor Systems Live Status

Telegram bot to track our sensors’ heartbeat

We implemented a “heartbeat” bot to monitor that our systems are live. It sends an update to our Telegram chat every hour.
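A minimal version of such a heartbeat using the Telegram Bot API’s sendMessage endpoint (the bot token and chat ID are placeholders):

```python
import time
import requests

BOT_TOKEN = "<bot-token>"  # placeholder credentials
CHAT_ID = "<chat-id>"

def send_heartbeat(text: str) -> None:
    """Post a status message to the team's Telegram chat."""
    requests.post(
        f"https://api.telegram.org/bot{BOT_TOKEN}/sendMessage",
        data={"chat_id": CHAT_ID, "text": text},
        timeout=10,
    )

while True:
    send_heartbeat("Sensors alive: Tray In/Out and Tablevision OK")
    time.sleep(3600)  # report once an hour
```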

Solution Comparison

Breakdown on both methods’ pros and cons

In terms of data accuracy, both methods are similar. However, Tablevision is better in terms of potential features and ease-of-use (see comparison above).

Another point of concern for Tray In/Out is getting stall owners to set up the necessary sensors in their stalls.

Cost Breakdown Analysis (CBA)

Cost breakdown for both methods

Both methods are similar in terms of initial setup costs. One could save more with better cameras that cover a larger viewing angle, allowing for fewer gateways (each Pi can accommodate up to four cameras).

If using the custom AutoML object detection API, the costs for Tablevision will be higher in the long run. Here’s how to deploy on Cloud Run to save on costs.

Setup

Even though we managed to make most of the sensors water-resistant, all Pis are still susceptible to water damage.

The acrylic board is also subject to wear and tear, so maintenance efforts need to be considered.

In conclusion, both methods are viable options to consider; the choice depends on the use case and its constraints.

However, for Tablevision to be cost-effective, a fixed internet plan is needed. Furthermore, the object-detection API needs to be moved from AutoML to Cloud Run.

Without these two measures, the daily cost of running Tablevision is high.

Challenges and Insights

Challenges

Beo Crescent Market Food Centre’s floor plan

Due to the layout of the hawker centre, we only managed to deploy one mobile hotspot, placed around Zone 2. Hence, the sensors on the trolley could be disconnected if the cleaner moved it to another zone (e.g. Zones 3 and 4).

To avoid this, we sought the cleaning supervisor’s assistance to keep the trolley within the designated area, preventing data inconsistency.

We also had several non-technical challenges. Since our Tray-In solution is deployed on the trolley, we had to use portable chargers as its power source. These had to be replaced daily due to their limited capacity.

As we don’t live nearby, we were only able to replace the portable charger and start the Tray-In data collection at 9am, a couple of hours after the stall had opened.

Key Insights

Overview analysis including Team 6 (collaboration) data

Our team gathered two main insights from the data collected over five days.

The dashboard feature allows us to view tray-return rates for specific periods of time. The data we’ve collected suggests an increase in the negative tray-return rate during lunch peak hours (11am–2pm).

This trend is consistent in both our solution and the data collected from our collaboration with Team 6 (G6).

Secondly, as Tablevision is able to capture the number of trays on a table, we noticed that a higher number of trays (3–4) makes patrons less likely to clear the table on their own.

Below is the distribution frequency for the positive & negative tray return rates:

Tablevision tray return rate data

These insights, together with our dashboard, address our problem statement of collecting accurate data for MSE/NEA.

Limitations

Let’s look into the limitations of Method A.

Firstly, our solution is not conclusive as it is just a proof-of-concept; our data collection only reflects one sector of the hawker centre, since we installed our solution on just one of its few trolleys.

To mitigate this limitation, we chose the trolley situated closest to our hawker stall.

Secondly, the low-grade motion sensor deployed is sensitive to surrounding movement, which was not ideal as we only wanted to focus on motion within the trolley.

In addition, any sudden movement of the trolley will result in false counts. To minimise inaccurate readings, we used black tape to limit the range of the motion sensor.

For Method B, it is a challenge to detect an immediate swap of patrons, as seen in the illustration below:

Ah, I “chope” the table already.

In such instances, as our camera only sends image data every ~2–3 seconds to save on data usage, it is not able to detect said scenario.

This might result in data collection inaccuracies. You can refer to our documentation here to see how we could mitigate this issue.

In all, MSE/NEA found our solution innovative, and the accurate data it provides will be useful for designing more effective campaigns.

Here’s some feedback on our solution from MSE/NEA:

“Critical thinking skills manifested through the 2 methods of measuring tray return rate. Outcomes measured against current records. Very Good.”

“Appreciate how you guys recognise the limitations to the setup and software. The tablevision is also pretty non-intrusive. The data collected was also similar to the dataset which we have which is rather accurate.”

Interviewing Beo Crescent Hawker Centre site supervisor

We conducted an interview with the site supervisor of Beo Crescent, Mr Tan. In summary, he mentioned that:

  • The setup is not disruptive to the cleaner’s operations.
  • The idea is very interesting — it will be great to scale across the entire hawker centre to gather more data.

Lessons Learnt

Besides being more conscious of our tray-return habits, here are some lessons we’ve picked up over the three months:

Simple approach

With a wide variety of sensors to choose from, we found that simple sensors (cameras, FSRs, RFID readers and motion sensors), implemented in the right way, can tackle the problem. Sophisticated sensors might only complicate the solution.

Making “Things” Easier

With the help of IoT, manual work can be reduced and data collection made easier with the right systems in place. Data can flow from the sensors to the cloud for processing and be displayed in real time on a dashboard.

Expectations vs Reality

Things (pun intended) do not always go according to the plans we brainstormed. Our solution needed to adapt to the hawker centre’s harsh environment, and we could only achieve that by being on the ground to understand the situation first-hand.

Through this, we also learnt that agility and improvisation go a long way in times of ambiguity and uncertainty.

We’d like to sincerely thank Nan Yuan Teochew Fishball Noodle stall and Mr Tan for their kind hospitality during our project implementation at Beo Crescent.

To see how our solution works at Beo Crescent Market Food Centre:

  • View our video here.

P.S. For readers interested in viewing our solution’s GitHub repository, click here. To view our detailed journal, click here.

Written by Team Hardcode (SMU-X)

Hi, we’re Team Hardcode (G7) from the Singapore Management University (SIS). Our team consists of Emmanuel, Jack, Kelvin, Weng Kiat, Xin Wei and Zi Xiang.