BESO took a huge step towards innovation using Google Cloud BigQuery and Machine Learning tools
BESO Consulting's IT department was looking for cost and analytics efficiencies, as well as innovation through Artificial Intelligence tools. BESO had a single data lake in a hybrid architecture and needed a flexible, scalable platform that preserved all the functionality of their current processes. Before GCP, BESO had used Oracle solutions for many years, but given current demands for research and data products, their platform made it complex to add new integrations, connectivity, and the external APIs used by their analytics team.
Leopoldo Daniel Alarcón Romero, Data Scientist Lead at BESO, began working with Nubosof's certified Cloud Architects and Data Engineers to adopt BigQuery as the principal data processing and storage solution, AI Platform Notebooks as the transformation and evaluation environment, and GCP Artificial Intelligence APIs to obtain better, faster, and more cost-effective results. We designed a four-step process that makes the workflow efficient:
- Data arrives in the Oracle systems and is moved to BigQuery by internal scripts.
- The data is transformed and processed with a combination of Jupyter Notebooks and Artificial Intelligence APIs such as the Google Vision API.
- The results are loaded back into BigQuery to be visualized in their Business Intelligence dashboard (Tableau).
- The visualized data is analyzed through user-friendly dashboards designed by the data analytics team.
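The four steps above can be sketched in Python. This is a minimal, illustrative mock that uses in-memory stand-ins for the Oracle source, the BigQuery tables, and the Vision API call; all names, data, and the `annotate_image` helper are hypothetical, not BESO's actual code (the real pipeline would use the google-cloud-bigquery and google-cloud-vision client libraries).

```python
# Step 1: data arrives in the source system and is moved to a
# BigQuery staging table (modeled here as a plain list of dicts).
oracle_rows = [
    {"id": 1, "image_uri": "gs://example-bucket/doc_1.png"},
    {"id": 2, "image_uri": "gs://example-bucket/doc_2.png"},
]
bigquery_staging = list(oracle_rows)  # stand-in for a BigQuery load job


# Step 2: transform each record, enriching it with labels from an
# image-annotation call (stubbed out here in place of the Vision API).
def annotate_image(uri):
    """Hypothetical stand-in for a Vision API label-detection request."""
    return ["receipt", "document"]


processed = [
    {**row, "labels": annotate_image(row["image_uri"])}
    for row in bigquery_staging
]

# Step 3: load the enriched results into a results table that the
# BI tool (Tableau) reads from.
bigquery_results = processed

# Step 4: the dashboards run aggregate queries over the results,
# e.g. counting how often each label appears.
label_counts = {}
for row in bigquery_results:
    for label in row["labels"]:
        label_counts[label] = label_counts.get(label, 0) + 1

print(label_counts)  # {'receipt': 2, 'document': 2}
```

The point of the sketch is the shape of the flow, not the specific calls: each stage writes to a shared store (BigQuery) so that the notebooks, the AI APIs, and the dashboards stay decoupled from one another.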
Following this success, our client achieved a cost reduction of more than 50% thanks to simpler, more dynamic, and faster data management on an integrated Google Cloud Platform. They now deliver more value to their internal and external clients, spending their time and resources tracking and delivering valuable data instead of managing infrastructure, and growing faster without sacrificing the security or integrity of their data.