Summary: I led the migration of reporting from Tableau to Power BI, reducing licensing costs while introducing faster and more flexible reporting prototypes.
S · Situation
UnitedHealth Group was heavily invested in Tableau, but licensing was expensive, and performance optimizations were not yielding significant improvements. End users were dependent on Tableau despite Microsoft 365 already providing Power BI licenses at no extra cost.
T · Task
I needed to prove the value of Power BI by quickly creating prototypes that stakeholders would accept. The challenge was to drive cultural change and overcome resistance to moving away from an established tool.
A · Action
I created rapid prototypes in Power BI to showcase interactive visuals, improved usability, and integration with existing Microsoft systems. I presented side-by-side comparisons showing both speed improvements and cost savings. I gradually migrated core reports while ensuring consistency of metrics across both platforms during the transition.
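A minimal sketch of that metric-consistency check, assuming hypothetical CSV exports of the same report from each platform; the file and column names are illustrative, not the actual UHG schema:

```python
import pandas as pd

# Hypothetical exports of the same report from each platform.
tableau = pd.read_csv("tableau_extract.csv")
powerbi = pd.read_csv("powerbi_extract.csv")

# Aggregate the same metric on both sides and compare them column by column.
t_agg = tableau.groupby("region")["revenue"].sum().rename("tableau")
p_agg = powerbi.groupby("region")["revenue"].sum().rename("powerbi")
diff = pd.concat([t_agg, p_agg], axis=1)
diff["delta_pct"] = (diff["powerbi"] - diff["tableau"]) / diff["tableau"] * 100

# Flag any region where the platforms disagree by more than 0.5%.
print(diff[diff["delta_pct"].abs() > 0.5])
```

Any slice where the two platforms disagreed beyond a small tolerance was a candidate for investigation before retiring the Tableau version.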
R · Result
Licensing costs were reduced by approximately $0K annually, since Power BI came bundled with Microsoft 365. Report delivery time improved by ~0%, increasing stakeholder adoption. Leadership shifted to Power BI as the primary reporting platform, establishing a long-term reporting strategy.
Stack: Power BI, SQL Server, DAX, SSIS, Tableau
Summary: I improved reporting performance at Hyatt by introducing data archiving and incremental load strategies, which reduced slowness in BI applications and improved end-user adoption.
S · Situation
Hyatt’s BI apps were suffering from slowness caused by large historical datasets. Every report query scanned years of data, making even simple dashboards slow to load. Users were dissatisfied and adoption was dropping.
T · Task
I needed to redesign the load process so reports would run faster while keeping historical data accessible. The solution had to balance speed with accuracy and not risk data integrity.
A · Action
I proposed and implemented an archiving mechanism for historical data, paired with incremental loading for new data. This allowed queries to focus only on recent periods by default, while archived data could still be accessed if needed. I also restructured indexes to reduce query time further.
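Stripped down, the load pattern looked roughly like the sketch below; the table names, Oracle-style bind variables, and `conn` object are assumptions, not Hyatt's actual schema:

```python
# Illustrative watermark-based incremental load plus archiving; not the
# production PL/SQL. SALES_FACT* names are hypothetical, and `conn` is any
# DB-API connection (Oracle-style :name binds assumed).
from datetime import datetime, timedelta

def incremental_load(conn, watermark: datetime) -> datetime:
    """Merge only rows newer than the last successful load."""
    cur = conn.cursor()
    cur.execute(
        "INSERT INTO SALES_FACT "
        "SELECT * FROM SALES_FACT_STAGE WHERE LOAD_TS > :wm",
        {"wm": watermark},
    )
    cur.execute("SELECT MAX(LOAD_TS) FROM SALES_FACT")
    (new_wm,) = cur.fetchone()
    conn.commit()
    return new_wm or watermark  # carry the watermark forward

def archive_old_rows(conn, keep_days: int = 730) -> None:
    """Move rows outside the retention window to an archive table."""
    cutoff = datetime.now() - timedelta(days=keep_days)
    cur = conn.cursor()
    cur.execute(
        "INSERT INTO SALES_FACT_ARCHIVE "
        "SELECT * FROM SALES_FACT WHERE LOAD_TS < :c", {"c": cutoff})
    cur.execute("DELETE FROM SALES_FACT WHERE LOAD_TS < :c", {"c": cutoff})
    conn.commit()
```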
R · Result
Report execution time improved by ~0%, cutting average load times nearly in half. BI adoption increased as reports became usable in daily operations. Leadership appreciated that the solution preserved access to all historical data without compromising performance.
Stack: Oracle SQL, PL/SQL, Informatica, OBIEE
Summary: I developed a regression-based cost analysis model for UnitedHealth Group that identified the top drivers of healthcare expenses and introduced new KPIs for leadership review.
S · Situation
Healthcare cost drivers were not fully understood by leadership, leading to generic strategies that did not address high-impact areas. Large datasets were collected, but insights were scattered and inconclusive. Managers were reluctant to approve new KPIs without formal validation.
T · Task
My responsibility was to analyze healthcare data, identify the most significant factors driving costs, and present actionable KPIs that could be adopted by stakeholders. I had to accomplish this without formal approval channels for new metric definitions.
A · Action
I applied regression analysis on large healthcare datasets to identify key variables affecting costs. The model highlighted age, doctor visits, and pre-existing conditions as the top three drivers. I then designed new KPIs to measure these factors and built clear reports to present them directly to leadership, even without manager approval.
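A hedged sketch of that driver analysis with statsmodels; `claims.csv` and the column names are stand-ins for the actual member-level data:

```python
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("claims.csv")  # hypothetical: one row per member
X = sm.add_constant(df[["age", "doctor_visits", "preexisting_conditions"]])
model = sm.OLS(df["annual_cost"], X).fit()

# Coefficients rank the cost drivers; R-squared is the variance explained.
print(model.params.sort_values(ascending=False))
print(f"R^2 = {model.rsquared:.2f}")
```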
R · Result
The model explained ~0% of cost variance and directly influenced budget decisions. Reporting turnaround time was reduced by ~0%, allowing faster decisions. Leadership adopted the new KPIs, which improved planning accuracy and built trust in data-driven decision making.
Stack: Power BI, Statistical Analysis, Cluster Analysis
Summary: I automated a loyalty program data quality process that repeatedly caused extended downtime during month-end reporting. The solution reduced downtime drastically and introduced reliable fallback mechanisms.
S · Situation
Walgreens’ loyalty data load process was prone to repeated failures, requiring multiple reloads. During month-end, these failures caused downtime of nearly seven hours, delaying reports and frustrating business users. Teams hesitated to allow changes to standard processes, even though system reliability was suffering.
T · Task
My task was to improve load reliability, cut downtime, and deliver a rollback option to ensure business continuity. I needed to balance automation with safeguards, since leadership had concerns about unintended disruptions.
A · Action
I created an automated data quality detection and reload mechanism with backup and rollback features. The automation checked data integrity before release, reduced dependency on manual reloads, and ensured rapid recovery if needed. I also documented the process clearly so that stakeholders felt comfortable approving it without waiting for top-down approvals.
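In simplified form, the check-and-rollback flow might look like the following; the tables and the specific integrity checks are hypothetical:

```python
# Snapshot, validate, then release -- rolling back if anything fails.
# LOYALTY_* table names are illustrative; `conn` is a DB-API connection.
def passes_quality_checks(conn) -> bool:
    cur = conn.cursor()
    cur.execute("SELECT COUNT(*) FROM LOYALTY_STAGE WHERE MEMBER_ID IS NULL")
    null_keys = cur.fetchone()[0]
    cur.execute("SELECT COUNT(*) FROM LOYALTY_STAGE")
    total = cur.fetchone()[0]
    return total > 0 and null_keys == 0

def load_with_rollback(conn) -> None:
    cur = conn.cursor()
    cur.execute("DELETE FROM LOYALTY_BACKUP")  # refresh the snapshot
    cur.execute("INSERT INTO LOYALTY_BACKUP SELECT * FROM LOYALTY_FACT")
    if not passes_quality_checks(conn):
        conn.rollback()
        raise ValueError("stage data failed integrity checks; nothing released")
    cur.execute("INSERT INTO LOYALTY_FACT SELECT * FROM LOYALTY_STAGE")
    conn.commit()
```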
R · Result
Downtime was reduced by ~57%, from seven hours to three hours during month-end. Data quality improved by ~0%, boosting stakeholder confidence. The rollback feature ensured zero business disruption, and adoption spread across other month-end processes.
Stack: Oracle SQL, Unix, MDBMS, Hyperion
Summary: I built modular time-series forecasting pipelines with automated refresh and seasonality handling.
S · Situation
Scheduling across thousands of stores was manual and reactive.
T · Task
I needed to stabilize forecasts and reduce manual corrections.
A · Action
I engineered time-series features, built hierarchical forecasting models, automated refreshes with ADF triggers, and incorporated staffing constraints into the forecasts.
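As a rough illustration of the seasonality side, assuming a frame with one row per store per day and a parsed `date` column:

```python
import numpy as np
import pandas as pd

def add_seasonality_features(df: pd.DataFrame) -> pd.DataFrame:
    """Weekly and annual seasonality features for store-level models."""
    out = df.copy()
    out["dow"] = out["date"].dt.dayofweek                 # weekly cycle
    out["weekofyear"] = out["date"].dt.isocalendar().week.astype(int)
    # Fourier terms capture smooth annual seasonality without 52 dummies.
    day = out["date"].dt.dayofyear
    out["annual_sin"] = np.sin(2 * np.pi * day / 365.25)
    out["annual_cos"] = np.cos(2 * np.pi * day / 365.25)
    return out
```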
R · Result
Manual intervention was reduced by ~0%, and forecast stability improved, with fewer late corrections.
Stack: Python, PySpark, Databricks, ADF, ADLS, SQL
Summary: I built a regression model to identify the strongest predictors of healthcare costs, providing stakeholders with clear KPIs and actionable insights that improved decision-making around cost management.
S · Situation
UnitedHealth Group faced rising healthcare costs without clear visibility into the primary drivers. Leadership relied on fragmented reporting, which made it difficult to prioritize interventions. Existing KPIs were broad and lacked specificity.
T · Task
I needed to create a data science solution that could explain cost variance across patients, identify the most significant contributing factors, and suggest new KPIs for monitoring. This required careful statistical validation to gain executive trust.
A · Action
I developed a regression model using large patient datasets. The analysis revealed that age, frequency of doctor visits, and pre-existing conditions accounted for the majority of cost variance. I created new KPIs around these drivers and visualized the findings in clear, stakeholder-friendly reports. I presented results directly to leadership, bypassing delays in KPI approvals.
R · Result
The model explained ~0% of cost variance and reduced reporting complexity by 0%. Leadership adopted the new KPIs, which improved cost forecasting accuracy by over 0%. My work influenced key budget allocation decisions and helped establish predictive modeling as part of UnitedHealth’s analytics strategy.
Stack: Python, SQL Server, SSIS, Tableau
Summary: I designed and deployed a predictive alert system within Walgreens BI applications to forecast operational issues, reducing unexpected disruptions and improving response times.
S · Situation
Walgreens experienced recurring issues in BI application data loads. Failures were usually detected only after problems escalated, resulting in delayed reporting and user frustration. The lack of proactive detection created inefficiencies across multiple business units.
T · Task
I was tasked with improving the reliability of BI applications. Initially, the scope was limited to fixing load issues, but I recognized the opportunity to introduce predictive analytics to anticipate failures before they caused downstream problems.
A · Action
I applied time-series forecasting models to historical load data to identify recurring failure patterns. I created a predictive alert mechanism that flagged likely failures and provided insights to system administrators. The solution was embedded directly into existing BI apps, ensuring seamless adoption without requiring additional platforms.
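Reduced to its simplest form, the pattern analysis scores each upcoming load window by its historical failure rate; the column names below are assumptions about how load history was recorded:

```python
import pandas as pd

def risk_by_window(history: pd.DataFrame) -> pd.DataFrame:
    """Historical failure rate per (weekday, hour) load window."""
    return (history.groupby(["weekday", "hour"])["failed"]
                   .mean()
                   .rename("failure_rate")
                   .reset_index())

def alerts(history: pd.DataFrame, threshold: float = 0.3) -> pd.DataFrame:
    """Windows whose failure rate exceeds the alert threshold."""
    risk = risk_by_window(history)
    return risk[risk["failure_rate"] > threshold]
```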
R · Result
Prediction accuracy reached ~0%, allowing teams to act on alerts proactively. Report delays were reduced by 0%, and end-user satisfaction increased significantly. The system became a model for predictive monitoring in other Walgreens applications.
Stack: Statistical Analysis, Python, PL/SQL, Unix
Summary: I implemented regression models within Experian’s Sales Insight platform to analyze customer behavior across EMEA and APAC regions, providing leadership with more accurate forecasting and improved customer segmentation.
S · Situation
Experian needed to improve sales forecasting and customer analysis across EMEA and APAC. Existing reporting tools were descriptive but failed to uncover deep relationships between customer factors and revenue outcomes.
T · Task
My task was to apply advanced analytics to customer data, identify meaningful drivers of sales, and generate insights that could be used to improve regional sales strategies.
A · Action
I built regression models that linked customer demographics, engagement frequency, and product mix to revenue outcomes. I collaborated with business stakeholders in both regions to ensure the models reflected local realities. The results were translated into actionable reports with prototypes of improved KPIs.
R · Result
The regression models improved forecast accuracy by ~0% compared to prior methods. Regional managers used the insights to refine customer targeting strategies, which led to an estimated 0% improvement in campaign ROI. Leadership adopted the regression framework as a standard part of sales insight reporting.
Stack: Python, Informatica, PL/SQL, Statistical Analysis
Summary: I leveraged Databricks to design actionable KPIs for Bombardier after analyzing stakeholder needs and aligning reporting with business OKRs, significantly improving governance and decision-making.
S · Situation
Bombardier’s teams lacked consistent KPIs and struggled with fragmented governance across data sources. Stakeholders had different interpretations of the same metrics, leading to confusion and inefficient decision-making.
T · Task
I was tasked with creating standardized KPIs that aligned with organizational OKRs and addressed the pain points of each team. I needed to balance diverse stakeholder requirements while ensuring governance compliance.
A · Action
I met with stakeholders across teams to understand reporting needs and existing gaps. Using Databricks, I processed large volumes of operational and financial data to derive consistent KPIs. I validated these with stakeholders through prototypes and refined them iteratively to ensure adoption.
R · Result
Adoption of new KPIs improved reporting accuracy by ~0%. Teams reduced metric-related disputes by over 0%, improving trust in analytics. Leadership highlighted the initiative as a cornerstone for data governance maturity.
Stack: Python, Databricks, SQL, Data Governance
Summary: I optimized Walgreens’ BI Apps load process by analyzing dependency chains and creating parallel execution strategies. This improved overall load performance and helped establish new KPIs for system monitoring.
S · Situation
Walgreens’ BI Apps suffered from long-running Out-of-the-Box loads that delayed reporting. The business required timely reporting for executive dashboards, but load inefficiencies caused frequent bottlenecks. Leadership resisted modifying vendor-delivered jobs, fearing compliance risks.
T · Task
I needed to optimize system performance while respecting Out-of-the-Box restrictions. The goal was to speed up data availability without compromising vendor support or introducing system instability.
A · Action
I performed dependency analysis on tables and fact-dimension relationships. I restructured jobs into parallel execution groups, ensuring loads were ordered logically but without unnecessary serialization. I also introduced KPIs to measure system performance, which allowed me to demonstrate consistent improvements.
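The leveling behind those parallel groups can be sketched with Python's standard graphlib; the job graph here is illustrative, not the actual BI Apps load plan:

```python
from graphlib import TopologicalSorter

# Hypothetical jobs mapped to the jobs they depend on.
deps = {
    "load_dim_store": set(),
    "load_dim_product": set(),
    "load_fact_sales": {"load_dim_store", "load_dim_product"},
    "refresh_aggregates": {"load_fact_sales"},
}

ts = TopologicalSorter(deps)
ts.prepare()
wave = 1
while ts.is_active():
    ready = list(ts.get_ready())   # everything safe to run in parallel now
    print(f"wave {wave}: {ready}")
    ts.done(*ready)
    wave += 1
```

Each wave serializes only where a genuine dependency exists, which is what removes unnecessary serialization from the plan.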
R · Result
Data load completion time was reduced by ~0%, making critical dashboards available hours earlier. Reporting accuracy improved by ~0% as dependencies were better synchronized. Leadership adopted performance KPIs as part of the monitoring process, enabling sustainable improvements.
Stack: Python, PL/SQL, Oracle Data Integrator, Oracle BI Apps
Summary: I optimized Experian’s DMT to process large volumes of data faster, delivering significant improvements in migration efficiency for senior leadership teams.
S · Situation
Experian’s DMT struggled to process large daily loads, creating bottlenecks for cross-regional data migration. Leadership needed timely results to meet SLAs, but the tool’s performance was inconsistent and often caused delays.
T · Task
I was tasked with improving throughput and efficiency of DMT jobs without sacrificing data quality checks. The solution had to demonstrate tangible improvements in both processing speed and reliability.
A · Action
I tuned queries, streamlined data validation scripts, and implemented incremental batch execution instead of full reprocessing. I also automated validation checks to prevent manual bottlenecks and ensure quality at scale.
R · Result
Processing speed improved by ~0%, cutting migration times significantly. Reliability increased by ~0%, ensuring more consistent SLA compliance. Leadership recognized the optimization as a high-value contribution to EMEA/APAC reporting efficiency.
Stack: PL/SQL, Unix, Windows PowerShell
Summary: I implemented an Oracle Hyperion Data Relationship Management (DRM) server for Walgreens in just two weeks, enabling critical master data management and data governance capabilities during a business-sensitive period.
S · Situation
Walgreens urgently needed a test DRM environment to validate critical reporting changes. Leadership initially considered manual workarounds due to the tight timeline and perceived complexity of setup.
T · Task
I had to deliver a fully functional DRM test environment quickly, with minimal disruption, and without relying on manual processes.
A · Action
I leveraged my systems knowledge to configure, test, and validate a new DRM server instance. I automated portions of the setup with Windows scripting to save time. I coordinated with infrastructure teams to streamline approvals and deployment.
R · Result
The server was deployed in two weeks, ahead of the critical deadline. Testing downtime risk was reduced by ~0% compared to manual alternatives. This success built trust in automation-first approaches for infrastructure delivery.
Stack: Oracle EPM, Hyperion, Data Governance, Windows PowerShell
Summary: I redesigned Toyota’s data load process by opting for full re-extractions instead of incremental loads to prevent downstream data loss.
S · Situation
Toyota Financial Services faced frequent reporting discrepancies caused by incremental data loads missing updates. Downstream systems were impacted, and leadership was concerned about data reliability.
T · Task
I was responsible for ensuring that downstream systems received accurate and complete data, even if this required rethinking the traditional incremental load strategy.
A · Action
Drawing from past experience, I proposed full data re-extractions, ensuring complete synchronization across systems. I optimized the process to minimize runtime impacts, balancing completeness with performance.
R · Result
Data reliability improved by ~0%, with downstream discrepancies nearly eliminated. Although processing times increased slightly, stakeholder trust in reporting accuracy improved dramatically. This trade-off was accepted as the best solution.
Stack: PL/SQL, Informatica PowerCenter, Unix, DAC
Summary: I optimized Experian’s SOOSA C 2.1 daily load, reducing downtime by 70% through new process design and automation.
S · Situation
Experian’s daily load jobs were prone to failures, consuming excessive time and delaying critical reports across global regions. Leadership was pressuring the team to improve reliability quickly.
T · Task
I was tasked with redesigning the process to reduce downtime and improve load performance, ensuring faster delivery of daily reports.
A · Action
I created optimized SQL procedures, parallelized portions of the ETL process, and automated restart checkpoints. I built monitoring scripts to flag and restart failed jobs without manual intervention.
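A minimal sketch of the restart-checkpoint idea, assuming hypothetical step scripts and a JSON state file (the production version ran under Unix with DAC):

```python
import json
import subprocess
from pathlib import Path

STATE = Path("daily_load.checkpoint.json")
STEPS = ["extract", "transform", "load", "publish"]  # illustrative step names

def run_daily_load() -> None:
    done = json.loads(STATE.read_text()) if STATE.exists() else []
    for step in STEPS:
        if step in done:
            continue                                  # already succeeded; skip
        result = subprocess.run(["./run_step.sh", step])  # hypothetical wrapper
        if result.returncode != 0:
            raise RuntimeError(f"step {step!r} failed; rerun resumes here")
        done.append(step)
        STATE.write_text(json.dumps(done))            # checkpoint each success
    STATE.unlink()                                    # clean run: clear state
```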
R · Result
Downtime was reduced by ~70%, and daily reporting timelines improved by ~0%. Stakeholder confidence increased, and the optimized framework became the new standard for daily data processing.
Stack: PL/SQL, Informatica PowerCenter, Unix, DAC
Summary: I created a new data warehouse model for RiteAid integration within Walgreens, supporting balance, journal, and account reporting with proper governance.
S · Situation
Walgreens needed to integrate RiteAid’s reporting structure into its own warehouse. The existing models did not align with new governance requirements, leading to inconsistent reporting.
T · Task
My goal was to design a warehouse model that incorporated RiteAid’s data while maintaining consistent governance and reporting performance.
A · Action
I designed fact and dimension models covering balances, journals, and accounts. I worked with governance teams to ensure compliance and created structures that minimized redundancy. I tested models against existing reports to validate correctness.
R · Result
Integration accuracy improved by ~0%, and reporting consistency across Walgreens and RiteAid data was achieved. Performance improved by ~0% as models were streamlined for query efficiency.
Stack: PL/SQL, Oracle BI Apps, Unix, Shell Scripting
Summary: I transformed a recurring load issue into a predictive analytics opportunity, building forecasting models that enabled proactive alerts and actionable insights in Walgreens’ BI applications.
S · Situation
Walgreens BI teams faced frequent reporting delays due to data load failures. Stakeholders only requested fixes to stabilize the load, but this reactive approach left the business exposed to recurring risks. Leadership hesitated to approve predictive solutions because of concerns about model reliability.
T · Task
I had to stabilize the immediate reporting issue while exploring predictive models that could forecast both failures and business trends. The challenge was to prove value quickly and minimize perceived risk.
A · Action
I developed time-series forecasting models using Python and SQL to detect anomalies and predict future failures. I integrated predictive alerts directly into BI dashboards, enabling business users to see both risks and forward-looking metrics. I also iteratively refined models to reduce false positives, gaining confidence from end users.
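One simple way the false-positive tuning can be expressed is a rolling z-score filter; `series` stands in for any daily load metric, such as rows loaded:

```python
import pandas as pd

def anomalies(series: pd.Series, window: int = 28, z: float = 3.0) -> pd.Series:
    """Flag points beyond z rolling standard deviations of the recent mean."""
    mean = series.rolling(window).mean()
    std = series.rolling(window).std()
    return (series - mean).abs() > z * std
```

Raising `z` trades sensitivity for fewer false alarms, which is the kind of iteration that won end-user confidence.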
R · Result
The solution reduced reporting disruptions by ~0%, preventing multiple critical outages. Business teams began using predictive dashboards to anticipate issues and plan actions. This success demonstrated that predictive analytics could be embedded directly in operational BI systems.
Stack: Statistical Analysis, PL/SQL, Python, Unix
Summary: I created machine learning–driven KPIs and OKRs for Bombardier, aligning data governance with team-level business goals.
S · Situation
Bombardier lacked standardized KPIs across departments, making it difficult for leadership to measure progress. Each team tracked metrics differently, resulting in fragmented reporting and misaligned goals.
T · Task
I was responsible for analyzing the reporting landscape, consulting with stakeholders, and defining actionable KPIs supported by ML-driven insights.
A · Action
I interviewed stakeholders across teams to understand their pain points and goals. Using Databricks, I combined data sources and built ML-based models to highlight the most impactful metrics. I then proposed new KPIs and created a framework to align them with company-wide OKRs.
R · Result
KPI adoption increased by ~0% across teams, creating a unified reporting strategy. Leadership had clearer visibility into business performance, and governance became data-driven. The initiative positioned ML-driven KPI frameworks as the standard for analytics maturity at Bombardier.
Stack: Databricks, Python, Statistical Analysis
Summary: I replaced manual Excel-based reporting at Galderma with automated pipelines and machine learning models that generated actionable insights.
S · Situation
Galderma’s reporting relied heavily on manual Excel work, which was slow, error-prone, and not scalable. Users spent hours preparing reports instead of analyzing results. Leadership sought an automation-first solution but lacked in-house ML expertise.
T · Task
I needed to automate the reporting process, integrate ML insights, and ensure reports were delivered faster and with higher accuracy.
A · Action
I built Databricks pipelines to ingest and preprocess raw data automatically. I applied ML models to generate predictive KPIs such as churn risk and sales forecasts. I connected the outputs to Power BI dashboards, ensuring stakeholders could easily consume the insights.
R · Result
Manual reporting effort was reduced by ~0%, saving dozens of hours each week. Forecasting accuracy improved by ~0%, giving leadership actionable insights. The automation freed up analysts to focus on strategy rather than manual tasks.
Stack: Databricks, Python, Statistical Analysis
Summary: I applied machine learning within Walgreens’ Field Transformation Program to balance upstream and downstream system changes, providing predictive insights that minimized risk.
S · Situation
Walgreens was undergoing a major field transformation, with tightly integrated upstream and downstream system changes. These dependencies created risks of cascading failures if not managed proactively.
T · Task
My task was to provide predictive insights to anticipate where risks could occur, ensuring system stability during transformation.
A · Action
I analyzed transformation data and built ML models to simulate impacts of system changes. By predicting high-risk scenarios, I advised teams on preventive actions. I collaborated with both governance and analytics teams to validate models against real-world outcomes.
R · Result
Transformation risks were reduced by ~0%, with fewer unplanned outages during rollout. Predictive insights improved coordination across teams and accelerated program delivery timelines by ~0%. The success showcased the value of ML in system transformation projects.
Stack: Statistical Analysis, PL/SQL, Data Governance
Summary: I proposed and delivered new KPIs for UnitedHealth Group by analyzing stakeholder requests and converting textual requirements into structured, data-driven metrics.
S · Situation
Product owners often described their needs in natural language, requesting “better insights” without clear definitions. Traditional BI processes failed to capture these requirements efficiently, leading to long delays in reporting improvements.
T · Task
I had to interpret ambiguous requirements, define precise KPIs, and build rapid prototypes that aligned with leadership’s intent, even without formal approvals.
A · Action
I used text-mining and classification techniques to parse stakeholder feedback, extracting common themes. I mapped those into quantifiable KPIs such as “member engagement by visit type” and “condition-specific cost ratios.” I then rapidly built prototypes in Python and BI tools to visualize the new KPIs.
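A hedged sketch of that theme extraction with scikit-learn; the feedback strings are invented for illustration:

```python
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

feedback = [
    "need better insight into member visits by type",
    "cost ratios for chronic conditions are unclear",
    "visit-type engagement is hard to track",
]

vec = TfidfVectorizer(stop_words="english")
X = vec.fit_transform(feedback)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

# Each cluster of similar requests becomes a candidate KPI definition.
for text, label in zip(feedback, labels):
    print(label, text)
```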
R · Result
Stakeholder satisfaction improved by ~0%, as prototypes directly matched stakeholder intent and reduced ad hoc data requests. KPI adoption increased across multiple product lines, reducing report development cycles by ~0%. The project demonstrated how NLP-driven requirement analysis could accelerate insight creation.
Stack: Python, NLP, Statistical Analysis, Power BI
Summary: I collaborated with Galderma stakeholders to translate raw narratives about business challenges into fresh KPIs, enabling more actionable reporting.
S · Situation
Stakeholders described problems in narrative form, such as “our reporting doesn’t reflect treatment-level outcomes.” These narratives rarely translated into clear data fields, leaving gaps in reporting.
T · Task
My job was to bridge the gap between unstructured stakeholder input and structured KPI design, while ensuring dashboards reflected real-world business meaning.
A · Action
I applied text-based clustering to stakeholder inputs to group common pain points. I then transformed those clusters into structured KPIs, validated by subject matter experts. Using Databricks pipelines, I automated data collection for these KPIs and built Power BI dashboards to present them.
R · Result
KPI relevance improved by ~0%, ensuring reports answered real stakeholder concerns. Dashboard usage increased by ~0% as leaders recognized their input reflected in metrics. This effort strengthened trust in data-driven storytelling across the organization.
Stack: Databricks, Python, Power BI
Summary: I introduced new KPIs at Hyatt by mining performance review notes and transforming qualitative observations into structured metrics.
S · Situation
Hyatt’s reporting relied on traditional KPIs, but managers often documented observations in free-text fields that were never analyzed. These unstructured notes contained valuable insights but went unused.
T · Task
I needed to extract themes from text data and design KPIs that reflected operational issues, improving the richness of reporting dashboards.
A · Action
I performed text mining on review notes to identify recurring topics, such as “check-in delays” and “room readiness.” I translated these into structured KPIs, then built prototype dashboards to visualize the new metrics. I validated the KPIs with managers to ensure they aligned with observed challenges.
R · Result
Reporting coverage expanded by ~0%, incorporating previously ignored operational signals. Dashboard refresh cycles were ~0% faster since structured KPIs simplified aggregation. Leadership recognized the dashboards as more actionable and holistic.
Stack: Python, Data Mining, Dashboards
Summary: I automated data quality reporting at Walgreens by generating textual narratives alongside dashboards, helping leadership interpret issues faster.
S · Situation
Data quality reports were highly technical, full of metrics but lacking context. Business stakeholders struggled to interpret whether a flagged issue was critical or negligible. This delayed decision-making and reduced trust in reporting.
T · Task
My goal was to generate plain-language narratives that summarized data quality checks, highlighting only the most important issues and suggesting next actions.
A · Action
I built an NLP-driven module that generated short narratives for each data quality check, e.g., “Loyalty transactions dropped by 15% this week, likely due to system downtime.” I integrated these summaries into BI dashboards so that executives could read them alongside charts.
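The production module was NLP-driven; the deliberately simple template version below shows only the shape of the generated narrative, with invented metric values and thresholds:

```python
def narrative(check: str, current: float, baseline: float) -> str:
    """Turn a data quality measurement into a plain-language summary."""
    change = (current - baseline) / baseline * 100
    direction = "dropped" if change < 0 else "rose"
    severity = "critical" if abs(change) > 10 else "minor"
    return (f"{check} {direction} by {abs(change):.0f}% versus baseline "
            f"({severity}). Suggested action: review the upstream load.")

print(narrative("Loyalty transactions", current=850_000, baseline=1_000_000))
# -> "Loyalty transactions dropped by 15% versus baseline (critical). ..."
```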
R · Result
Stakeholder comprehension improved by ~0%, measured through feedback surveys. Decision-making latency decreased by ~0%, as executives no longer had to consult technical teams for interpretation. The feature became a template for narrative reporting in other departments.
Stack: Python, NLP, Data Visualization, Power BI
Summary: I resolved data quality challenges in Walgreens’ Financial Accounting Hub (FAH) by balancing technical optimization with stakeholder concerns, building trust while delivering automation.
S · Situation
Walgreens’ FAH Out-of-the-Box load produced reporting discrepancies. Leadership resisted modifications to vendor-delivered jobs, fearing compliance risks. Business teams needed accurate reports but were frustrated with delays and quality issues.
T · Task
I was tasked with improving reporting accuracy while addressing resistance to change. I needed to deliver improvements without creating conflict or violating vendor agreements.
A · Action
I designed a complementary load process that operated in parallel, validating data quality without modifying the original vendor jobs. I framed the solution as a non-intrusive safeguard, not a replacement. I presented prototypes to stakeholders and positioned the project as risk-free, ensuring their buy-in.
R · Result
Reporting accuracy improved by ~0%. Month-end report preparation time dropped from six hours to less than two. Stakeholder adoption was strong, as the compromise respected vendor constraints while still achieving automation goals.
Stack: PL/SQL, Oracle Data Integrator, Oracle BI Apps
Summary: I led a major reporting transformation at UHG by advocating for Power BI over Tableau, overcoming resistance through prototypes and financial justification.
S · Situation
Stakeholders were committed to Tableau despite its high licensing costs and limited optimization potential. Power BI was already available through Microsoft 365 licenses but sat unused. Leadership feared disruption from a full migration.
T · Task
I needed to prove that Power BI was a viable replacement for Tableau, both technically and financially. My role was to manage stakeholder concerns, secure buy-in, and present a vision for the migration.
A · Action
I created rapid Power BI prototypes showing side-by-side comparisons with Tableau. I demonstrated improved usability, faster refresh times, and seamless integration with existing systems. I emphasized cost savings of roughly $15 per user per month, multiplied across thousands of licenses.
R · Result
Leadership approved the migration. Licensing costs dropped by ~0%, saving nearly $0K annually. Stakeholder adoption grew quickly, with report delivery time improving by ~0%. I successfully shifted organizational culture toward Power BI as the strategic reporting platform.
Stack: Power BI, Tableau, SQL Server, Python
Summary: I guided Walgreens’ reporting teams through a trade-off decision, balancing optimal technical solutions with business timelines and stakeholder priorities.
S · Situation
Walgreens requested a new cube design for MDBMS reporting. While the optimal solution was to create a fresh cube, leadership insisted on modifying existing cubes to save time. This created a risk of slowness as cube size expanded.
T · Task
My responsibility was to ensure stakeholders understood the trade-offs and committed to a solution aligned with both technical feasibility and business urgency.
A · Action
I presented side-by-side comparisons of cube strategies, including estimated performance impacts. I explained that modifying existing cubes would risk slowness but respected the business timeline. I gained agreement to proceed with modifications while documenting performance risks.
R · Result
Stakeholders adopted the modified cube solution. Performance impact was partially mitigated by query optimization, reducing slowness by ~0%. Most importantly, business timelines were met without conflict, and leadership trusted me as a balanced advisor.
Stack: Oracle BI Apps, PL/SQL, Hyperion
Summary: I drove stakeholder alignment at Bombardier by consolidating fragmented reporting practices into a unified KPI governance framework.
S · Situation
Teams across Bombardier tracked metrics differently, leading to misaligned goals and conflicting reporting outputs. Leadership needed a common framework to evaluate performance but struggled to gain consensus.
T · Task
My task was to bridge team silos, identify common ground, and define KPIs that could satisfy multiple departments.
A · Action
I organized workshops with stakeholders to surface pain points and goals. I mapped common themes and proposed unified KPIs that addressed multiple needs. Using Databricks, I built data pipelines that supported these KPIs, ensuring technical scalability alongside governance.
R · Result
Consensus adoption of the unified KPIs reached ~0% across departments. Reporting conflicts decreased significantly, and leadership gained consistent visibility across teams. The project improved trust and created a governance template for future KPI initiatives.
Stack: Databricks, Python, Data Mining
Summary: I resolved a persistent SLA reporting failure at Walgreens by collaborating directly with technical teams and rebuilding trust through hands-on problem-solving.
S · Situation
A special character handling issue caused Walgreens’ SLA reporting system to fail intermittently. Teams defaulted to manual extraction as a workaround, but no one attempted a permanent fix. Leadership was frustrated, and trust in IT support was declining.
T · Task
I needed to own the issue, identify the root cause, and build a permanent fix that stakeholders could trust. The solution had to balance vendor restrictions with practical reliability.
A · Action
I engaged multiple teams to trace the failure source, analyzed data lineage, and designed a custom load process to handle problematic cases. Since vendor jobs could not be altered, I created an external complementary process. I worked alongside technical teams directly, showing commitment beyond managerial alignment.
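An illustrative cleanup step for those problematic records; the normalization choices, and the assumption that control characters were among the culprits, are mine:

```python
import unicodedata

def sanitize(value: str) -> str:
    """Normalize Unicode and strip non-printable characters before load."""
    normalized = unicodedata.normalize("NFKC", value)
    return "".join(ch for ch in normalized if ch.isprintable())

print(sanitize("Caf\u00e9\x07 Store #12"))  # -> "Café Store #12"
```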
R · Result
System stability improved by ~0%, with manual extractions eliminated. SLA compliance improved across multiple reporting cycles. Most importantly, stakeholder trust was rebuilt as they saw me personally take ownership of resolving the issue.
Stack: PL/SQL, Oracle Data Integrator, Oracle BI Apps
Dec 2024 – Present · Deerfield, IL
Senior Data Scientist — Walgreens (TCS)
Databricks forecasting; ADF automation; modular Python components.
Jul 2023 – Nov 2024 · Chicago, IL
Sr. Data Analyst & Technical Architect — UnitedHealthcare (TCS)
Governed KPIs; Tableau → Power BI; SSIS & SQL Agent orchestration.
Jul 2022 – Jun 2023 · Chicago, IL
Data Analytics Lead — Bombardier (TCS)
Metrics for aero ops in Databricks; Python ETL; leadership insights.
Mar 2022 – Jun 2022 · Chicago, IL
Data Analyst & Data Governance Lead — Galderma (TCS)
Automated KPI reporting; Databricks packages; 25%+ reduction in manual effort.
Oct 2017 – Feb 2022 · Deerfield, IL
Data Analytics & EPM Lead — Walgreens (TCS)
Financial models for OKRs; forecasting; resolved ~75% of process issues.
Jun 2013 – Sep 2017 · Deerfield, IL
Master Data Management (MDM) and Analytics Lead (TCS)
Oct 2011 – May 2013 · Deerfield, IL
Senior Data Analyst and Report Developer (TCS)
Nov 2010 – Nov 2011 · Deerfield, IL
Data Analyst and Report Developer (TCS)
Dec 2009 – Oct 2010 · Deerfield, IL
Oracle Database Administrator (TCS)