Our client is a European company that has been in the security and surveillance space for more than 15 years.
They came to us with a challenge from a smart city project in a major European city: a surveillance system that counted vehicles and pedestrians at certain roundabouts and tracked the movement of vehicles between specific lanes.
Though object tracking is a well-studied problem in computer vision, tracking vehicles between lanes proved challenging, and they were looking for high-quality training data to improve the results of their models.
The dataset was too large for our client to manage in-house, and coordinating with freelancers spread around the globe was not something they wanted to venture into.
Since they had not originally planned on outsourcing this work, they had no proper guidelines or instruction documents for us to follow. We had to build everything from scratch, starting with discussions with the client to understand what they wanted to achieve with their model.
Our team started working on the data, and after a few rounds of back-and-forth questions and answers we put together a bare-bones version of the guideline document.
Using this document, our annotation team was up and running in a matter of days. The clarifications gathered during the first three days helped us hone the document into a detailed labeling instruction manual.
Armed with this document, we assembled a team of labelers that could deliver about 1,500 annotated frames per day, which was enough for our client to get on with the first phase of the project.
Though the project came together on very short notice, quality was critical to the performance of the model.
Our standard practice of running multiple quality checks during and after labeling proved invaluable. Within a few days we had weeded out the lapses in the annotation process that were causing mistakes. Our quick feedback loop made this possible, and it is also what allowed us to build the instruction document so quickly.
We followed this up with bi-weekly calls with the client to review the quality and speed of delivery and to identify room for improvement.
We delivered results on time, helping our client improve their models. Thanks to our team and our refined, well-defined annotation processes, we were able to quickly put together a guideline document and assemble a team that executed on the requirements.
Contact us with your requirements and we will set up a team to work on your free pilot project. No commitments on your side.