Synaptiq Business Intelligence Project

1) Problem definition & stakeholder alignment

What I did: I started by clarifying the core business problem with stakeholders and translated it into clear objectives: automate reporting, create a single source of truth, and deliver actionable KPIs.
Why it was needed: Prior reporting lived in scattered Excel files and created conflicting numbers and slow decisions.
What it solved for the business owner: Provided alignment on scope and priority so every dashboard feature mapped to an actual decision or KPI the business cared about.
What I learned: Investing time early to define measurable objectives prevents scope creep and ensures the BI work delivers business value, not just charts.


2) Data ingestion & storage (Git Bash → MySQL → Power BI)

What I did: Decompressed the raw files into SQL scripts using Git Bash and imported them into MySQL Workbench; created a dedicated connection (Business Insights 360) for Power BI.
Why it was needed: Raw files were inconsistent and not centrally queryable. A SQL landing zone handled schema management and enabled reliable refreshes.
What it solved for the business owner: Replaced fragile manual imports with a consistent database layer the team can maintain.
What I learned: A small amount of engineering (scripts, a reproducible load process) pays huge dividends in maintainability.


3) Star schema & dimensional modeling (data modeling)

What I did: Built a star schema with a central fact table (fact_actual_estimates) and clear dimension tables (product, customer, date, market), applying snowflake normalization where needed (sub_zone, category).
Why it was needed: Flattened Excel-style tables cause duplication, slow queries, and model ambiguity (many-to-many issues).
What it solved for the business owner: Faster queries, consistent metrics across departments and easier report-building for future projects.
What I learned: Good modeling reduces downstream DAX complexity — investing in schema design saves time and improves accuracy.


4) Power Query transformations & Query Folding

What I did: Cleaned and standardized raw data in Power Query: renamed tables, unpivoted manufacturer columns, created dim_date, appended actuals and forecasts into fact_actual_estimates, merged prices and deductions, and removed unnecessary columns before load. I designed queries to allow query folding back to MySQL where possible.
Why it was needed: Raw sources had inconsistent formats, extra columns, and different shapes; folding keeps heavy work on the database.
What it solved for the business owner: Smaller, faster datasets and predictable refresh behavior, with reduced load times and faster report opening.
What I learned: Label steps, group queries, uncheck unused loads, and prefer transformations that fold — they materially improve performance.


5) Date handling & fiscal calendar (dim_date + fiscal logic)

What I did: Built a comprehensive date dimension in Power Query; created fiscal year logic by shifting months (+4) and exposing fiscal descriptors (fy_desc) so the latest fiscal year could be labeled “EST”. I also produced fiscal_year as a table inside Power BI for robust filtering.
Why it was needed: The company’s fiscal year did not align with calendar year, and time-intelligence relies on a correct date dimension.
What it solved for the business owner: Accurate YTD/YTG, quarter calculations, and consistent fiscal filtering across reports.
What I learned: Always build a full date table early — time intelligence and all DAX time functions depend on it.
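
The month-shift trick can be sketched in a few lines of Python (a minimal sketch: the +4 shift implies a fiscal year starting in September, and the function name is mine, not the project's):

```python
from datetime import date

def fiscal_year(d: date, shift_months: int = 4) -> int:
    """Return the fiscal year for a calendar date by shifting it forward,
    mirroring the Power Query step of adding months so late-calendar-year
    dates roll into the next fiscal year (assumed September FY start)."""
    shifted_month = d.month + shift_months
    # Carry any month overflow into the year component.
    return d.year + (shifted_month - 1) // 12

print(fiscal_year(date(2021, 9, 1)))   # 2022 (first month of FY2022)
print(fiscal_year(date(2021, 8, 31)))  # 2021 (last month of FY2021)
```

The same boundary behavior is what the dim_date column needs: September onward belongs to the next fiscal year, January through August stays in the current one.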


6) Core financial measures & business KPIs (DAX)

What I did: I defined a set of base measures and columns that serve across all dashboards: Gross Sales (GS $), Net Invoice Sales (NIS $), Pre-Invoice Deduction, Post-Invoice Deductions (two types), Net Sales (NS $), Quantity, Manufacturing/Freight/Other Costs, Total COGS, Gross Margin (GM $ and GM %), GM per Unit, Net Profit $ and Net Profit %. I organized these measures into a Key Measures table and Display Folders.
Why it was needed: Centralized, well-named measures prevent mismatched numbers across reports and let visuals simply reference authoritative metrics.
What it solved for the business owner: One source-of-truth P&L logic usable by finance and leadership; no more Excel rework to reconcile KPIs.
What I learned: Keep critical business logic as measures (not calculated columns) where possible for performance and consistency.


7) Composite P&L mechanics & dynamic reporting (P&L Rows / P&L Values)

What I did: Built a P&L Rows table to define statement order, a P&L Values measure that dynamically returns the correct metric for each P&L line, last-year (P&L LY) and YoY change measures, and a P&L Final Value to allow dynamic column headers (Actual / LY / BM / Chg / Chg%). I used SWITCH(TRUE()) logic to map row order to the desired metric.
Why it was needed: Executive reports must show rows consistently (Gross Sales → Deductions → COGS → GM → Opex → Net Profit) and allow pivoting between benchmarks.
What it solved for the business owner: A single, readable P&L table that can switch between actuals, last year, and benchmark (targets) without building multiple static visuals.
What I learned: Using a dedicated control table plus SWITCH(TRUE()) makes complex pivot-like behaviors in Power BI robust and reusable.


8) Targets & benchmarking (targets import + vs LY / vs Target toggle)

What I did: Imported target values from Excel (net sales, GM, net profit), created target measures (NS Target $, GM Target $, NP Target $ and target percentages), created a small toggle table to switch between “vs LY” and “vs Target”, and coded measures to blank targets when users drill to product/customer levels where targets don’t exist. I added a BM Message to surface missing-target warnings.
Why it was needed: Leaders need both historical performance and progress against goals. Targets existed only at country level, so explicit logic was needed to avoid misleading comparisons at lower granularity.
What it solved for the business owner: Executives can instantly flip between trends vs last year and performance vs plan; safeguards prevent misinterpretation when targets are unavailable.
What I learned: Anticipate granularity mismatches and explicitly handle them; blanking and messaging avoids bad business decisions.


9) Forecasting accuracy & supply chain KPIs (Net Error, Absolute Error, Forecast Accuracy)

What I did: Separated Sales Qty vs Forecast Qty using the last_sales_month reference; created Net Error and Net Error %; used SUMX with month and product granularity to compute Absolute Error and Absolute Error % so positives/negatives don’t cancel out; defined Forecast Accuracy = 1 − ABS Error % and time-intel measures for LY values. I also created a Risk measure (Excess Inventory / Out of Stock) from net error sign.
Why it was needed: Aggregating errors directly hides real forecasting problems; we needed a granular method to surface true accuracy.
What it solved for the business owner: A trustworthy supply chain dashboard that reveals forecasting issues, enables corrective actions (inventory rebalancing, urgent orders) and quantifies improvement over time.
What I learned: SUMX + granular iteration is essential for correct aggregate error metrics — never rely on simple aggregates for error calculations.


10) Advanced DAX techniques — VAR, RELATEDTABLE, CALCULATE and performance patterns

What I did: I used variables (VAR) inside measures for clarity and performance, RELATEDTABLE and CALCULATE to fetch customer/product-level deductions and costs, and MAX to pull single-row values when appropriate. I wrapped repeated logic into reusable measures to avoid duplication.
Why it was needed: Complex business logic needs to be readable and fast; VAR reduces repeated calculations and makes debugging easier.
What it solved for the business owner: Faster report responsiveness and easier maintenance of business logic.
What I learned: Writing measures with VAR and breaking logic into small, named steps makes DAX far more maintainable and often faster.


11) Interactive UX: SWITCH(TRUE()), toggles, bookmarks, selection pane & what-if parameters

What I did: Implemented SWITCH(TRUE()) patterns to return the right metric based on selection, built toggle buttons and bookmarks to switch visuals (e.g., showing Net Profit % vs GM % visuals in marketing), added a What-If numeric parameter (“Target Gap Tolerance”) and built GM % Variance and a filter measure to highlight customers below tolerance. I used the Selection Pane and bookmarks to manage overlapping visuals and buttons.
Why it was needed: Users need flexible interaction — choose metrics, apply tolerance thresholds, and switch views without page reloads.
What it solved for the business owner: Non-technical users can run scenario checks and focus on problem areas quickly (e.g., customers with big margin gaps).
What I learned: Combining SWITCH(TRUE()), bookmarks, and what-if parameters produces very intuitive UX. Bookmarks + Selection Pane is a simple but powerful way to build feature toggles without heavy coding.
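
The tolerance filter reduces to a simple comparison per customer; a sketch under invented sample data (customer names, fields, and the 5% tolerance are illustrative):

```python
def below_tolerance(customers, target_gap_tolerance: float):
    """Flag customers whose GM % falls short of target by more than the
    what-if tolerance, mirroring the GM % Variance filter measure."""
    return [
        c["name"]
        for c in customers
        if (c["gm_target_pct"] - c["gm_pct"]) > target_gap_tolerance
    ]

customers = [
    {"name": "Acme",   "gm_pct": 0.30, "gm_target_pct": 0.38},  # gap 0.08
    {"name": "Globex", "gm_pct": 0.36, "gm_target_pct": 0.38},  # gap 0.02
]
print(below_tolerance(customers, 0.05))  # ['Acme']
```

In the report the what-if parameter supplies the tolerance value, so users move the slider and the flagged set updates live.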


12) Market share & unpivot techniques for competitive analysis

What I did: Imported a market-share file, unpivoted multiple manufacturer columns into manufacturer + sales_$ rows, extracted manufacturer names via text-before-delimiter, created a manufacturer dimension, and calculated Market Share % as manufacturer sales divided by total market sales. I visualized trends using a ribbon chart.
Why it was needed: Source file had wide format manufacturer columns — unpivoting made the data analysis-ready and consistent.
What it solved for the business owner: Clear, comparable market-share trends that leadership can use for strategic decisions (pricing, promotions, distribution).
What I learned: Unpivoting wide data into a tidy long format is often the simplest path to flexible analysis.
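
The unpivot-plus-share computation can be sketched as follows (column naming and the suffix-stripping step are illustrative stand-ins for the Power Query unpivot and text-before-delimiter steps):

```python
# Wide source: one sales column per manufacturer, as in the raw file.
wide = [
    {"market": "India", "AtliQ_sales": 40.0, "Dale_sales": 60.0},
]

# Unpivot to long form: one row per (market, manufacturer).
long_rows = []
for row in wide:
    for col, value in row.items():
        if col.endswith("_sales"):
            long_rows.append({
                "market": row["market"],
                # Text-before-delimiter: strip the "_sales" suffix.
                "manufacturer": col.rsplit("_", 1)[0],
                "sales": value,
            })

# Market Share % = manufacturer sales / total market sales.
total = sum(r["sales"] for r in long_rows)
for r in long_rows:
    r["share_pct"] = r["sales"] / total

print([(r["manufacturer"], r["share_pct"]) for r in long_rows])
```

Once in long form, the manufacturer dimension and share measure fall out naturally, and any new manufacturer column flows through without model changes.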


13) Tooltips, drill insights & micro-interactions

What I did: Built a tooltip report page (Trend) sized for popups showing NS $ and GM % over months, created a dynamic title using selected customer, and connected the tooltip to visuals. Also added drill-through and contextual labels for clarity.
Why it was needed: Tooltips allow quick micro-analysis without leaving the main page, improving exploration speed.
What it solved for the business owner: Faster investigations; operational teams can see trends on hover and decide if deeper action is required.
What I learned: Tooltips and micro-interactions dramatically raise the perceived quality of dashboards.


14) Operational expenses integration & Net Profit plumbing

What I did: Brought operational expenses from Excel into the model, mapped ads/promotions and other OPEX as calculated columns, and created Operational Expense $, Net Profit $, and Net Profit % measures; appended P&L rows to include these lines. Modeled operating expenses as negatives in waterfall charts so totals rendered intuitively.
Why it was needed: Finance wanted a full P&L that included operating levers, not just COGS and GM.
What it solved for the business owner: Ability to see the impact of operational spend on net profitability and run sensitivity tests.
What I learned: Presentation (treating OPEX as negative) matters — it makes waterfall and P&L narratives immediately understandable.
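
The sign convention is easy to see in miniature (line labels and amounts are invented): storing OPEX as negatives means a plain running total, which is exactly what a waterfall chart draws, lands on net profit.

```python
# P&L lines with operating expenses stored as negatives.
pnl = [
    ("Gross Margin", 32.0),
    ("Ads & Promotions", -6.0),           # OPEX as negative
    ("Other Operational Expense", -14.0), # OPEX as negative
]

# A simple sum (the waterfall's running total) yields net profit.
net_profit = sum(value for _, value in pnl)
print(net_profit)  # 12.0
```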


15) Performance optimization & dataset sizing (unload large tables, reduce columns)

What I did: Disabled loading of heavy staging tables when not needed (e.g., fact_sales_monthly after extracting last_sales_month), pruned unused columns, and narrowed numeric data types where possible. I also ensured heavy transformations that could be folded ran on the SQL side.
Why it was needed: Large imported datasets slow refresh and interaction.
What it solved for the business owner: Faster refresh times, snappier visuals, and lower maintenance overhead.
What I learned: Model discipline (only load what you need) is as important as clever DAX for performance.


16) Scheduled refreshes and “last refreshed” visibility

What I did: Configured dataset scheduled refresh in the Power BI Service (using a gateway for on-premises connections), and captured a refresh timestamp in Power Query (DateTime.LocalNow()) to display a “Last Refreshed” label in reports. Note that in the Service this evaluates in UTC, so the timestamp should be converted if local time is expected.
Why it was needed: Users must trust that numbers are current; visible refresh timestamps build confidence.
What it solved for the business owner: Reduced manual checks and fewer “is this up-to-date?” questions in meetings.
What I learned: Showing refresh metadata is a small UX win that increases the dashboard’s credibility.


17) Validation, reconciliation & governance

What I did: Built matrix validations (gross_sales = price × qty, pre-invoice deductions calculations, net_invoice_sales reconciliation) and reconciled them with Finance and Sales benchmarks. I also introduced messages for missing benchmarks and blanking logic for unavailable targets.
Why it was needed: To ensure calculated values are trustworthy and to catch data problems early.
What it solved for the business owner: Reduced risk of acting on bad data; faster root-cause discovery when numbers don’t match.
What I learned: Validation steps should be part of the deliverable — they’re not optional.
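
The reconciliation checks can be expressed as a small validator (a sketch of the idea, not the project's actual validation matrix; field names and the sample rows are illustrative):

```python
def validate_rows(rows, tol: float = 1e-6):
    """Reconciliation checks mirroring the matrix validations:
    gross sales = price x qty, and net invoice sales = gross sales
    minus pre-invoice deductions. Returns the failing row indexes."""
    bad = []
    for i, r in enumerate(rows):
        gs = r["price"] * r["qty"]
        nis = gs * (1 - r["pre_invoice_pct"])
        if abs(gs - r["gross_sales"]) > tol or abs(nis - r["net_invoice_sales"]) > tol:
            bad.append(i)
    return bad

rows = [
    {"price": 10.0, "qty": 5, "pre_invoice_pct": 0.10,
     "gross_sales": 50.0, "net_invoice_sales": 45.0},  # reconciles
    {"price": 10.0, "qty": 5, "pre_invoice_pct": 0.10,
     "gross_sales": 50.0, "net_invoice_sales": 40.0},  # fails reconciliation
]
print(validate_rows(rows))  # [1]
```

Surfacing the failing rows, rather than a single pass/fail flag, is what makes root-cause discovery fast when Finance numbers don't match.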


18) Delivery, documentation & handover

What I did: Documented data flow, DAX notes, PQ steps and published a project page containing methodology, visuals, and download links. Organized measures into display folders and provided a short user guide for toggles and filters.
Why it was needed: Dashboards require operational knowledge for long-term adoption.
What it solved for the business owner: Smooth handover to BI consumers and easier future enhancements.
What I learned: Good documentation multiplies the impact of a BI project — it reduces support requests and accelerates adoption.


Final outcome (business impact summary)

  • I turned fragmented Excel processes into a centralized, automated BI solution that produces a single source of truth across Finance, Sales, Marketing, Supply Chain, and Executive views.
  • The model delivered fast, auditable KPIs (net sales, gross margin, net profit), reliable forecasting diagnostics (Net Error, Absolute Error, Forecast Accuracy), and strategic insights (market share, revenue contribution).
  • Executives received both historical context (vs LY) and actionable benchmarks (vs targets), with interactive controls to drill, filter, and run scenario tests.