Good to meet you, I'm Yuka.
Learn more about me!
In 2020 I made a major shift in my career by learning Python and SQL. I was captivated by the innovations in data technology and motivated by the prospect of a career in the data industry. After working as a dietitian in the healthcare field for five years, I moved into a data analyst role to begin my new career path.
Four years later, I’ve developed foundational skills in SQL, data analysis, and visualization. Now, as a Data Engineer, I have substantial experience with data infrastructure, ETL/ELT pipelines and data integrations, database management, and automation and orchestration. I’m excited to advance my career and expand my expertise in the data world.
In oncology nutrition, dietitians use a set of evidence-based standards to determine a cancer patient’s optimal nutrient intake, accounting for both the disease state and the cancer treatment. I translated this logic into Python and created a custom calculator: I input basic anthropometrics and patient info, and it outputs all the nutrient intakes I need to record in my nutrition assessment documentation.
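As a rough sketch of how such a calculator can be structured (the 25–30 kcal/kg and 1.0–1.5 g/kg factors below are illustrative placeholders, not the clinical standards the actual tool encodes, and this is not clinical guidance):

```python
def nutrient_targets(weight_kg: float, stress_factor: float = 1.0):
    """Estimate daily energy and protein target ranges from body weight.

    The kcal/kg and g/kg factors here are illustrative placeholders;
    the real calculator applies evidence-based oncology standards
    adjusted for disease state and treatment.
    """
    kcal_range = (
        round(25 * weight_kg * stress_factor),
        round(30 * weight_kg * stress_factor),
    )
    protein_range = (
        round(1.0 * weight_kg, 1),
        round(1.5 * weight_kg, 1),
    )
    return kcal_range, protein_range

# Example: a 70 kg patient with no added stress factor
kcal, protein = nutrient_targets(70)
```

Encoding the standards as plain functions like this makes the assessment math repeatable and easy to record consistently in documentation.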
View Project

I developed a model that predicts the probability of excessive absenteeism from work. I preprocessed the data, built a logistic regression model, and packaged it as a Python module that makes predictions on new data.
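The prediction step of a packaged logistic regression model reduces to a dot product passed through a sigmoid; a minimal sketch, where the feature names and coefficients are made up for illustration rather than taken from the actual model:

```python
import math

# Hypothetical coefficients from a previously fit logistic regression
INTERCEPT = -1.2
WEIGHTS = {"transport_expense": 0.03, "children": 0.4, "bmi": 0.05}

def predict_proba(features: dict) -> float:
    """Return the predicted probability of excessive absenteeism."""
    z = INTERCEPT + sum(WEIGHTS[name] * value for name, value in features.items())
    return 1 / (1 + math.exp(-z))  # logistic (sigmoid) function

# New observation scored by the saved coefficients
p = predict_proba({"transport_expense": 20, "children": 2, "bmi": 24})
```

Wrapping this in a module lets downstream scripts score new records without re-running the preprocessing and training pipeline.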
View Project

I built a machine learning algorithm for a business case study using data from an audiobook company. It is a classification problem: predicting whether a customer will buy another audiobook. Built with Python and TensorFlow 2.0.
View Project

I support a custom-built web application developed in the .NET Framework (20+ years old), working closely with both the front end and back end. My responsibilities include troubleshooting bugs in the application, updating HTML and VB.NET code to improve usability, and writing new stored procedures and views in SQL Server to support functionality. This role built a strong foundation in debugging, legacy code maintenance, and SQL development, while reinforcing the importance of clean, scalable code and clear communication across teams.
I support and maintain a custom-built Python application that interfaces with the Wells Fargo Account Validation API to ensure secure and accurate verification of customer bank account information, with the main goal of reducing the risk of payment failures and fraud. My responsibilities span the full lifecycle of this locally installed tool. I manage the application’s integration with Wells Fargo’s secure API, including handling mutual TLS authentication by renewing and installing client-side certificates. I implement updates to the Python-based "Logic Box" as internal processes change. Additionally, I oversee installation and configuration of the application on end-user machines, ensuring a seamless experience for staff.
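On the client side, mutual TLS comes down to attaching a client certificate and key to the TLS context alongside normal server verification. A minimal stdlib sketch, where the certificate paths are placeholders and the actual endpoints and request flow are not shown:

```python
import ssl

# Standard context: verifies the server's certificate and hostname
context = ssl.create_default_context(ssl.Purpose.SERVER_AUTH)

# Placeholder paths; in practice these would be the bank-issued
# client certificate and private key, renewed before expiry.
# context.load_cert_chain(certfile="client.pem", keyfile="client.key")

# The context is then handed to the HTTPS client, e.g.:
# conn = http.client.HTTPSConnection("api.example.com", context=context)
```

Because the server rejects connections without a valid client certificate, certificate renewal is an operational task in its own right, which is why it features in the maintenance lifecycle described above.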
I integrated wage deduction data from the company’s HR vendor into the on-prem HR database (SQL Server). I designed and implemented a new ELT pipeline, using PowerShell scripting to automate the SFTP connection and data extract and an SSIS package to load and transform the data into the HR database. I also wrote a complex SQL stored procedure for the reporting and analytics the executive team needed to analyze the cost of company-paid benefits for each employee.
I collaborated with the Lawson consultants and our SQL DBA to upgrade and migrate SQL Server and the Lawson application, which involved reconfiguring multiple data integrations and connections between Lawson and other finance and accounting systems. I also designed and implemented a new role- and security-class-based system in Lawson Security Administrator to clean up an outdated and cluttered security setup.
I managed the migration of the on-premises TM1 database to the cloud. The migration included configuring the IBM Satellite Connector to establish secure connections between IBM Cloud and on-premises data sources.
I implemented a new ELT pipeline that loads CRM data into Finance’s TM1 database, which involved writing MDX to leverage the TM1 TurboIntegrator tool. I created a new TurboIntegrator process to automate the extraction, transformation, and load of data into the TM1 cubes and dimensions.
November 2022 - Present
Lead the design and implementation of scalable ETL/ELT pipelines to process donor/funding, employee, and payroll data
Build and maintain database schemas to support advanced analytics and reporting for HR, Finance, and Accounting teams
Collaborate with leadership teams to define data requirements and deliver analytics-ready datasets
November 2021 - November 2022
Created complex SQL queries to extract, clean, and analyze data, including combining data from multiple databases
Designed and maintained reports and visualizations for the finance and accounting teams
Led database management and data integrations from internal and external data sources
June 2019 - November 2021
Supported four surgeons in the Hepato-Pancreato-Biliary Surgery clinic, specializing in providing nutrition care for patients after surgery
Supported three GI Medical Oncology clinics, providing nutrition care for patients undergoing chemotherapy or chemoradiation
September 2016 - May 2019
Clinical inpatient dietitian at Kaweah Delta Medical Center, providing nutrition therapy for critically ill patients in ICU, Cardiovascular-ICU, and Oncology units
Outpatient oncology dietitian at Sequoia Regional Cancer Center-Radiation Oncology, specializing in enteral and parenteral nutrition support for patients undergoing chemoradiation
Iowa State University
January - July 2016
Bastyr University
Kenmore, WA
Graduated June 2015