A few things I’ve built and shipped. Most repos include a README with data, steps to run, and visuals.
Python + Salesforce integration that fetches well records, cleans them with pandas, and renders interactive layers in ArcGIS for stakeholders.
Scheduled job authenticates to Salesforce (simple_salesforce), extracts objects, and writes clean Parquet/CSV for GIS.
Layers styled by status/type with drilldowns; supports stakeholder queries and exports.
Daily refresh with error logging and email notifications; validation checks on row counts and schema.
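The extract-and-validate step above can be sketched roughly as below. This is a minimal illustration, not the production job: the `Well__c` object, its fields, and the credential placeholders are all hypothetical.

```python
"""Sketch of the scheduled Salesforce extract: pull records, strip
API metadata, run row-count/schema checks, write Parquet for GIS.
Object and field names below are illustrative assumptions."""
import pandas as pd


def clean_records(records):
    """Drop Salesforce's per-record 'attributes' metadata and
    lowercase column names for the GIS export."""
    df = pd.DataFrame(records).drop(columns=["attributes"], errors="ignore")
    df.columns = [c.lower() for c in df.columns]
    return df


def validate(df, expected_columns):
    """Validation checks mentioned above: non-empty row count and
    required columns present before anything is written out."""
    if df.empty:
        raise ValueError("no rows extracted")
    missing = set(expected_columns) - set(df.columns)
    if missing:
        raise ValueError(f"missing columns: {sorted(missing)}")
    return df


if __name__ == "__main__":
    from simple_salesforce import Salesforce  # pip install simple-salesforce
    sf = Salesforce(username="...", password="...", security_token="...")
    raw = sf.query_all("SELECT Id, Name, Status__c FROM Well__c")["records"]
    df = validate(clean_records(raw), ["id", "name", "status__c"])
    df.to_parquet("wells.parquet", index=False)  # or df.to_csv(...)
```

The pure cleaning/validation functions are kept separate from the network call so they can be unit-tested without Salesforce credentials.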
Automated intake→dashboard pipeline reduced response time from 4 days to ≤3 hours by removing manual handoffs.
Python parses structured emails/Excel, normalizes fields, and writes a single source-of-truth table.
KPIs for SLA, backlog, and throughput with DAX measures; auto-refresh via gateway.
Single-pane status visibility; bottleneck alerts; consistent audit history for handoffs.
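The normalization step feeding that single source-of-truth table might look like the sketch below. The column names and mapping are hypothetical stand-ins, not the real intake schema.

```python
"""Sketch of intake normalization: rename raw intake columns to a
canonical schema, parse dates, and de-duplicate requests. The field
map here is an illustrative assumption."""
import pandas as pd

# Hypothetical mapping from raw intake headers to canonical names.
FIELD_MAP = {"Req #": "request_id", "Opened": "opened_at", "Status": "status"}


def normalize(frame):
    """Return one clean row per request, with consistent names,
    parsed timestamps, and normalized status values."""
    out = frame.rename(columns=FIELD_MAP)
    out["opened_at"] = pd.to_datetime(out["opened_at"], errors="coerce")
    out["status"] = out["status"].str.strip().str.lower()
    return out.drop_duplicates(subset="request_id")


# Usage: read each source (e.g. pd.read_excel("intake.xlsx")),
# concat the frames, then normalize() into the source-of-truth table.
```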
Educational app that uses PokéAPI to track collection progress and show generation/category insights.
Matplotlib charts for counts by type/gen and collection progress.
Clean README with setup steps and screenshots for quick review.
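The type/generation tally behind those charts can be sketched as below, assuming PokéAPI's standard `/pokemon/{name}` payload shape (the `types` field); the collected-Pokémon list is a placeholder.

```python
"""Sketch of the collection tally: count collected Pokémon by type
from PokéAPI payloads, then chart the counts with Matplotlib."""
from collections import Counter


def type_counts(pokemon):
    """Count how many collected Pokémon fall under each type, using
    the 'types' field of PokéAPI /pokemon responses."""
    counts = Counter()
    for mon in pokemon:
        for slot in mon["types"]:
            counts[slot["type"]["name"]] += 1
    return counts


if __name__ == "__main__":
    import requests
    import matplotlib.pyplot as plt

    collected = ["bulbasaur", "charmander"]  # placeholder collection
    payloads = [
        requests.get(f"https://pokeapi.co/api/v2/pokemon/{n}").json()
        for n in collected
    ]
    counts = type_counts(payloads)
    plt.bar(counts.keys(), counts.values())
    plt.title("Collection by type")
    plt.show()
```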
Have a role or collaboration in mind? I love automating tedious workflows and building clear visuals from real-world data.