Baltic Apprenticeships Saves Over £2k Per Month Through Better Data Infrastructure
Harry, Senior Business Intelligence Manager, was struggling with manual processes and fragmented data sources. The solution didn’t come from a new hire, but from within his existing team.
Businesses often chase insight when what they really need is infrastructure. The value of business insight hinges on how clean and reliable the underlying data is, and that all starts with a robust data pipeline. That’s where Data Engineers come in.
With data sources multiplying, cloud technologies advancing, and AI reshaping expectations, organisations can no longer afford to overlook the importance of data engineering. Despite often sitting behind the scenes, Data Engineers are the architects of modern data operations, and their role directly influences every real-time dashboard, every automated alert, and every strategic business decision.
At Baltic Apprenticeships, the journey toward a mature, infrastructure-led data capability didn’t happen overnight. Like many growing organisations, Baltic started with a small team of Analysts focused on reporting and BI needs, which gradually grew into a broader data function as the business scaled.
Business Challenges: Manual Processes & Lack of Cloud Infrastructure
Despite having a capable team producing regular reports and valuable insights, Senior Business Intelligence Manager Harry Hobbs knew something was missing. Data maturity isn’t a one-time milestone; it’s an ongoing process of strengthening capabilities, rethinking roles, and investing in infrastructure to meet changing business needs. When Harry looked across his team, he saw a growing problem: his Analysts were skilled, but they were bogged down by manual processes and fragmented data workflows.
“Our data wasn’t stored in cloud infrastructure or databases – it was pulled directly from systems like Salesforce. We were relying on manual processes and isolated CSVs, which were inefficient and limited our capability.”
The symptoms were familiar: dashboards that refreshed just once a day, reactive reporting cycles, and increasingly time-consuming data prep. What once worked for a smaller, less complex data landscape was quickly becoming unsustainable.
“We used to be focused on dashboards alone, with no awareness of the infrastructure needed underneath, which led to slow performance, poor governance, and limited insight.”
Manual script execution, Salesforce placement-status updates, and cross-system consistency checks consumed around 110 working hours a month, shared across the Data, Admin, and Sales teams.
As the volume and complexity of data increased, the team found themselves spending more time cleaning and preparing datasets than analysing them. The Analyst-only model had reached its limits, and Harry knew that if the business wanted to move from basic reporting to truly using data for strategic decision-making, something had to change.
The Shift from Analysis to Engineering
For Harry, the answer didn’t come in the form of a new hire; it came from within the team. One Analyst stood out: Ben, a natural problem-solver with a sharp eye for the mechanics behind the numbers. Without prompting, he regularly went beyond his Analyst role, questioning data structures, tracing data sources, and considering how the entire data ecosystem could be improved.
“Ben showed a genuine passion for data quality, system structure, and governance. He naturally gravitated toward fixing underlying data issues. It was clear his strengths aligned with engineering.”
Recognising this potential, Harry made a strategic decision: rather than recruit externally, he would invest in developing Ben’s capability through a Personal Development Plan. This offered the structured training, technical depth, and real-world application Ben needed to evolve from Analyst to Engineer and, crucially, to start building the infrastructure the business lacked.
At the time, Harry’s team had no formal data architecture, only a few basic Python scripts and a growing patchwork of CSVs stored in OneDrive.
“He built everything from scratch. We had no databases or Lakehouse. Ben introduced Azure SQL, the lake, the warehouse, automated pipelines, and bidirectional syncs between systems.”
With eight pipelines automating over 390 data activities every 24 hours, spanning learner management and CRM data tables, bespoke dashboards such as OTJ reporting, and even Salesforce territory updates, Ben dramatically increased throughput across the full data stack.
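The case study doesn’t show Ben’s actual implementation, but the pattern it describes, ordered pipeline activities that extract source data, clean it, and load it into a warehouse table, can be sketched in a few lines of Python. Everything below (the `Pipeline` class, activity names, and table names) is illustrative, not Baltic’s real code.

```python
from dataclasses import dataclass, field
from typing import Any, Callable

@dataclass
class Pipeline:
    """A minimal pipeline: an ordered list of named activities.

    Each activity is a function that receives the previous
    activity's output, mirroring an extract -> clean -> load flow.
    """
    name: str
    activities: list[tuple[str, Callable[[Any], Any]]] = field(default_factory=list)

    def add(self, label: str, fn: Callable[[Any], Any]) -> "Pipeline":
        self.activities.append((label, fn))
        return self

    def run(self, data: Any = None) -> Any:
        for label, fn in self.activities:
            data = fn(data)
        return data

# Illustrative data: raw CRM rows with messy status values,
# and a dict standing in for a warehouse table.
raw_rows = [
    {"learner": "A", "status": " active "},
    {"learner": "B", "status": None},
]
warehouse: dict[str, list] = {}

pipeline = (
    Pipeline("crm_sync")
    .add("extract", lambda _: raw_rows)
    .add("clean", lambda rows: [
        {**r, "status": (r["status"] or "unknown").strip()} for r in rows
    ])
    .add("load", lambda rows: warehouse.setdefault("learner_status", rows))
)

result = pipeline.run()
print(len(result), result[0]["status"])  # 2 active
```

A scheduler (cron, Azure Data Factory, or similar) would invoke `run()` on a timer; the chaining itself is the part each of the eight pipelines would share.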
Previously, 137 Tableau dashboards had to be refreshed manually each day. Ben automated these refreshes to run four times daily, improving accuracy and efficiency, and 16 formerly manual scripts, each taking up to five minutes to run, are now fully automated.
This wasn’t just a technical upgrade; it was a fundamental shift in how the team operated. Ben’s work laid the foundation for live reporting, consolidated datasets, and a single source of truth. Dashboards that were once refreshed once per day are now updated every 20–30 minutes, giving stakeholders and the wider team real-time visibility and the ability to spot mistakes and respond faster.
“Dashboards are now refreshed within 20–30 minutes instead of daily, which means errors get spotted and fixed faster. With less time spent prepping data and more time validating it, the overall quality has improved.”
What Ben brought to the team wasn’t just code; it was confidence. Analysts could focus on generating insights rather than preparing data, and Managers and Senior Leaders could trust that the numbers were accurate, up-to-date, and complete.
Operational Results & Return on Investment
Ben’s progression from Analyst to Data Engineer didn’t just solve a technical problem; it redefined what was possible for Baltic’s entire data function. With the right infrastructure in place, the team was no longer limited by fragmented systems or manual workarounds. Instead, they had a foundation built for scale, speed, and strategic value.
It was a clear example of how data engineering empowers Analysts, removing the burden of constant data preparation and allowing them to focus on the high-value work of generating insights and driving decision-making.
Automation was introduced across multiple processes, from data entry and reporting to real-time operational updates, and what had once required hours of manual effort could now happen instantly, seamlessly feeding insight back into the business.
The results were substantial:
- 110 working hours saved per month across multiple teams
- £2,146.10 saved per month due to automation
- 23 manual processes completely eliminated
- 60% reduction in timeliness-related reporting errors
The automation of daily database and script refreshes improved the reliability of OTJ training records, a critical component for evidencing funding claims to the ESFA. Meanwhile, automated updates in Salesforce ensured sales and account managers had real-time learner status, eliminating delays caused by manual updates or staff absence.
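One step in an automated Salesforce sync like the one described, deciding which records actually need updating, is pure logic and can be sketched without any API call. The function name, the IDs, and the field name `Placement_Status__c` below are hypothetical illustrations, not Baltic’s real schema; a real sync would pass the resulting payloads to a client library such as simple-salesforce.

```python
def build_status_updates(learners: dict, salesforce_records: dict) -> list:
    """Compare internal learner statuses against what Salesforce holds
    and return update payloads only for records that have drifted.

    'Placement_Status__c' is a made-up custom-field name used purely
    for illustration.
    """
    updates = []
    for learner_id, status in learners.items():
        if salesforce_records.get(learner_id) != status:
            updates.append({"Id": learner_id, "Placement_Status__c": status})
    return updates

# Internal system says learner 003A moved on; Salesforce is stale.
learners = {"003A": "On Programme", "003B": "Completed"}
sf_state = {"003A": "Enrolled", "003B": "Completed"}
payloads = build_status_updates(learners, sf_state)
print(payloads)  # only 003A needs an update
```

Sending only the drifted records keeps the sync fast and avoids unnecessary API calls, which matters when it runs many times a day.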
“We can now automate responses to customer feedback, update internal systems in real time, and provide far stronger customer experience and internal efficiency.”
Most importantly, this wasn’t seen as a one-off transformation; it marked a new chapter in Baltic’s data journey. With a dedicated Data Engineer in place and the right structures underpinning their analytics, Harry’s team no longer just reported on the business; they were actively engineering smarter, faster, and more data-informed ways of working.
Advice to Leaders
For business leaders still weighing up whether to invest in a dedicated Data Engineer, Harry’s advice is to be proactive, not reactive: invest early, build strong foundations, and empower teams to move forward with confidence.
“It might feel like a luxury if you’ve never had one, but once you do, it’s hard to imagine how you ever managed without. Robust infrastructure needs dedicated skills and time. It’s not a cherry-on-top – it’s a foundational requirement for a modern data team.”
As cloud technologies become standard and AI moves from innovation to expectation, the importance of clean, structured, and well-managed data has never been greater. Data Engineering is the backbone of operational efficiency, business intelligence, and digital transformation.
Final Thoughts
Through foresight, investment, and a commitment to nurturing internal talent, Baltic have built a scalable, intelligent foundation for insight, innovation, and agility. What began as a response to inefficiency evolved into a data function that now drives operational excellence and enables better decisions at every level of the business.