The simplest path that works for most beginners
- Set up like an analyst (Week 0–1)
Create a clean folder structure, start a GitHub repo, and write a 1-page metric definition template. Employers reward reproducible work.
- Become strong in spreadsheets (Week 1–3)
Build one “messy-to-clean” workbook plus a 1-page KPI sheet. This is the fastest path to real entry-level reporting work.
- Learn SQL properly (Week 2–7)
Solve 40–60 SQL questions and complete one small SQL case study. SQL is the #1 employability lever for junior analytics roles.
- Choose ONE BI tool (Week 6–10)
Pick Power BI or Tableau and publish a clean KPI dashboard + a short insight summary.
- Publish a capstone + apply with proof (Week 12–16)
Ship one polished end-to-end capstone (SQL + dashboard + memo). Use one portfolio link in every application.
Data Analytics Career Path (Beginner) — Track 1
A structured entry path into junior analytics roles. Build solid foundations, get strong in spreadsheets and SQL, pick one BI tool (Power BI or Tableau), add light Python for repeatability, and publish a capstone project with an Analyst Proof Pack you can share in interviews.
Fast facts
- Level: Beginner (no experience required)
- Time: Fast 8–10 weeks • Standard 12–16 weeks • Busy 4–6 months
- Weekly effort: 3–15 hrs/week (depends on pace)
- Core output: Analyst Proof Pack + 1 capstone case study
- Tools: Sheets/Excel, SQL + (choose one) Power BI or Tableau (+ light Python recommended)
- Target roles: Junior Data Analyst, Reporting Analyst, BI Analyst (junior)
Who this is for
- IGNOU learners and early-career professionals who want a structured entry path into analytics.
- Students who prefer proof-of-work (workbook, SQL repo, dashboard, memo) over theory-only learning.
- Career switchers targeting roles like Junior Data Analyst, Reporting Analyst, or BI Analyst (junior).
- Beginners who want clarity on what to learn first (Excel → SQL → BI) without falling into “data science overload.”
Time required (realistic estimates)
This roadmap is flexible. Most beginners progress fastest by building spreadsheets + SQL first, then publishing one dashboard and a capstone.
- Fast track: 8–10 weeks (10–15 hrs/week) — foundations + spreadsheets + SQL basics + one BI dashboard.
- Standard pace: 12–16 weeks (6–10 hrs/week) — most students and working learners.
- Busy schedule: 4–6 months (3–5 hrs/week) — steady progress without burnout.
Optional add-ons (only if needed)
- Second BI tool (not recommended for beginners): +2–4 weeks
- Stronger Python (beyond “light Python”): +4–8 weeks
- Interview drilling (SQL + dashboard walkthroughs): +2–3 weeks
Outcomes (what you can do after this path)
- Define metrics clearly using a metric dictionary (formula, grain, filters, caveats).
- Clean and analyze data in Excel/Sheets using pivots, lookups, QA checks, and clear charts.
- Write correct, readable SQL with joins, CTEs, grouping, and date logic.
- Build a clean BI dashboard (Power BI or Tableau) with stable KPIs + drilldowns.
- Write a short executive insight memo (“So what / Now what”).
- Use light Python to clean and re-run analysis reproducibly (recommended, not heavy).
- Publish a capstone project that looks like workplace output (SQL + dashboard + memo).
- Apply confidently to Junior Data Analyst / Reporting Analyst roles with proof-of-work.
Prerequisites
- No prior experience required: beginner-friendly.
- Basic English reading/writing: most learning resources are in English.
- Laptop/PC + stable internet: for practice datasets, tools, and publishing.
- Willingness to publish proof: workbook + SQL repo + dashboard export + memo/case study.
Tools you’ll use
- Spreadsheets: Google Sheets or Microsoft Excel (cleaning, pivots, QA, KPI tables).
- SQL practice: any learner-friendly environment (SQLite/Postgres/BigQuery-style exercises).
- BI (choose one): Power BI or Tableau (dashboard + export for portfolio).
- GitHub: repos + README for reproducible work.
- Light Python (recommended): pandas basics in a notebook (Colab/Jupyter).
- Portfolio home: GitHub + a single index page (Notion / Drive / simple webpage).
Roadmap
Step 1 (Week 0–1): Analyst fundamentals (required baseline)
Your goal this week is to set up like a professional analyst. Employers care less about course completion and more about whether you can produce organized, reproducible work (even on small datasets).
- File organization + documentation: clean folder structure, naming conventions, basic assumptions log.
- Git/GitHub basics: commit, push, README, and a clear repo structure.
- Data formats: CSV vs JSON, what a schema is, and why “grain” matters.
Suggested open resources:
- GitHub Skills: Learn Git + GitHub with hands-on exercises
Optional structured certificate:
- Google Data Analytics Professional Certificate (Coursera)
- IBM Data Analyst Professional Certificate (Coursera)
Deliverables:
- Repo: analytics-foundations, with a README describing your workflow and tools.
- Metric definition template (1 page): metric name, formula, grain (daily/user/order), filters, exclusions, owner, and “known caveats”.
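As an illustration, one entry from the metric definition template can be captured as a small structured record. This is a minimal sketch; the metric name and values below are hypothetical:

```python
# One hypothetical entry following the 1-page metric definition template.
metric = {
    "name": "Repeat Rate (30d)",
    "formula": "repeat_customers / total_customers",
    "grain": "user",  # daily / user / order
    "filters": ["paid orders only"],
    "exclusions": ["test accounts", "refunded orders"],
    "owner": "analytics",
    "known_caveats": ["seasonality around sales events"],
}

# A minimal QA check: every required field is present and non-empty.
required = {"name", "formula", "grain", "filters",
            "exclusions", "owner", "known_caveats"}
missing = required - {k for k, v in metric.items() if v}
print("missing fields:", sorted(missing))
```

Keeping definitions in a structured form like this makes it easy to reuse the same dictionary in your capstone README later.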
Guided learning courses
If you prefer a guided course format or structured practice, use the curated options below.
Guided learning courses: View recommended Data Analytics courses
Step 2 (Week 1–3): Spreadsheets for analysis (Excel / Google Sheets)
Many entry-level roles begin with spreadsheets. Your goal is to become “dangerous” in spreadsheet analysis and reporting hygiene.
- Pivots + calculated fields: summarize, segment, and trend cleanly.
- Lookup logic: XLOOKUP / INDEX-MATCH, text cleanup, date handling.
- Data QA basics: duplicates, missing values, validation rules.
- Chart hygiene: correct scales, clear labels, and no misleading visuals.
Optional credential: Excel Skills for Business Specialization (Coursera — Macquarie)
Deliverables:
- Messy-to-clean workbook: raw tab → cleaned tab → pivot tab → executive summary tab.
- 1-page KPI sheet: includes a metric dictionary aligned to your Step 1 template.
Step 3 (Week 2–7): SQL fundamentals (your #1 employability lever)
SQL is the most consistently requested skill for junior analytics roles. Your goal is to write correct, readable SQL and explain join logic without confusion.
- Must-know SQL: SELECT, WHERE, GROUP BY, HAVING
- JOINs: inner/left joins and handling duplicates after joins
- Structure: CTEs, CASE WHEN
- Dates: date functions
- Logic: basic subqueries
Proficiency targets:
- You can explain join behavior at different grains (user-level vs order-level).
- You can write readable SQL using CTEs and comments.
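The join-and-grain skills above can be sketched end to end using Python's built-in sqlite3 module, so no database install is needed. The tables and values here are hypothetical, chosen only to show a readable CTE and a join that respects grain:

```python
import sqlite3

# In-memory database with a tiny hypothetical schema:
# orders is order-grain, items is line-item grain.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE orders (order_id INT, user_id INT, order_date TEXT);
CREATE TABLE items  (order_id INT, sku TEXT, amount REAL);
INSERT INTO orders VALUES (1, 10, '2024-01-05'), (2, 10, '2024-01-20'), (3, 11, '2024-01-21');
INSERT INTO items  VALUES (1, 'A', 5.0), (1, 'B', 7.5), (2, 'A', 5.0), (3, 'C', 9.0);
""")

# Readable SQL: aggregate line items to order grain in a CTE,
# then join once per order so revenue is not double-counted.
query = """
WITH order_totals AS (
    SELECT order_id, SUM(amount) AS revenue
    FROM items
    GROUP BY order_id            -- collapse to order grain first
)
SELECT o.user_id,
       COUNT(*)       AS orders,
       SUM(t.revenue) AS revenue
FROM orders o
JOIN order_totals t ON t.order_id = o.order_id
GROUP BY o.user_id
ORDER BY o.user_id;
"""
rows = con.execute(query).fetchall()
print(rows)  # one row per user: a user-grain summary
```

Being able to explain why the CTE groups before the join (order grain vs line-item grain) is exactly the proficiency target above.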
Optional structured courses (names only):
- SQL for Data Science (Coursera — UC Davis)
- The Complete SQL Bootcamp (Udemy)
Verified resources:
- SQLBolt: Interactive SQL fundamentals
- Kaggle Learn — Intro to SQL: Practice SQL using BigQuery-style exercises
- Mode SQL Tutorial: Work-like SQL tutorial and exercises
Deliverables:
- Repo: sql-practice, with 40–60 solved questions (with comments).
- SQL case study: one business question answered with 4–6 queries + short write-up.
Step 4 (Week 6–10): BI dashboards (choose ONE tool)
Pick one BI tool and commit. Your goal is to produce dashboards that resemble workplace outputs: clear KPIs, definitions, drilldowns, and a clean narrative.
- Model basics: relationships, data model fundamentals
- Conceptual star schema: facts + dimensions
- Measures/calculations: calculations, filter interactions, drilldowns
- UX discipline: clear titles, definitions, annotation
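The conceptual star schema (facts + dimensions) can be sketched in miniature before you touch a BI tool. This sqlite3 sketch uses hypothetical table and column names; the pattern, not the data, is the point:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
-- Dimension tables: descriptive attributes, one row per entity.
CREATE TABLE dim_product (product_id INT, category TEXT);
CREATE TABLE dim_region  (region_id INT, region TEXT);
-- Fact table: one row per sale (the grain), keyed to dimensions.
CREATE TABLE fact_sales  (sale_id INT, product_id INT, region_id INT, amount REAL);
INSERT INTO dim_product VALUES (1, 'Books'), (2, 'Toys');
INSERT INTO dim_region  VALUES (1, 'North'), (2, 'South');
INSERT INTO fact_sales  VALUES (1, 1, 1, 20.0), (2, 2, 1, 15.0), (3, 1, 2, 30.0);
""")

# A typical BI measure: a fact metric sliced by dimension attributes,
# which is what drilldowns in Power BI / Tableau do under the hood.
rows = con.execute("""
SELECT p.category, r.region, SUM(f.amount) AS revenue
FROM fact_sales f
JOIN dim_product p ON p.product_id = f.product_id
JOIN dim_region  r ON r.region_id  = f.region_id
GROUP BY p.category, r.region
ORDER BY p.category, r.region;
""").fetchall()
print(rows)
```

In Power BI or Tableau, the same shape appears as a data model with relationships from the fact table to each dimension.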
Tool track (choose ONE):
- Option A: Power BI
- Microsoft Learn — Get started with Microsoft data analytics: Begin your Microsoft analytics journey
- Power BI documentation hub: Power BI documentation + learning resources
Optional credential: Microsoft Certified: Power BI Data Analyst Associate (PL-300)
- Option B: Tableau
- Tableau Analyst Learning Path: Official learning path (skills + best practices)
- Tableau training entry point: Training hub (including free options)
- Tableau Public (portfolio publishing): Publish dashboards publicly for portfolio review
Optional credential: Salesforce Certified Tableau Data Analyst
Deliverables:
- KPI overview dashboard: executive view (stable KPIs + trends).
- Diagnostic page: breakdowns by product/region/channel.
- PDF export: include a 10-bullet insight summary (what changed, why, now what).
Step 5 (Week 8–12): Statistics essentials (avoid wrong conclusions)
You do not need advanced math. You do need correct reasoning about uncertainty, bias, and interpretation.
- Mean vs median, variance, distributions
- Correlation vs causation
- Sampling bias and seasonality
- Confidence intervals (conceptual)
- When to use significance tests (and when not to)
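A few of these ideas can be checked numerically with a short stdlib-only script. The order values below are made up for illustration; the sketch shows why the median resists outliers and how a bootstrap gives a conceptual confidence interval:

```python
import random
import statistics

# Mean vs median: one outlier distorts the mean, not the median.
order_values = [20, 22, 19, 21, 23, 500]  # hypothetical order values with an outlier
print("mean:  ", statistics.mean(order_values))    # pulled up by the outlier
print("median:", statistics.median(order_values))  # robust summary

# Conceptual confidence interval via bootstrap resampling:
# resample the data many times and look at the spread of the means.
random.seed(0)
boot_means = sorted(
    statistics.mean(random.choices(order_values, k=len(order_values)))
    for _ in range(1000)
)
low, high = boot_means[25], boot_means[975]  # middle ~95% of resampled means
print("approx 95% interval for the mean:", round(low, 1), "to", round(high, 1))
```

The wide interval is itself the lesson: with six noisy observations, you should hedge any claim about the "true" average.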
Verified resources:
- OpenStax Introductory Statistics (2e): Free textbook (credible baseline)
- OpenIntro Statistics: Free textbook + exercises
Deliverable: one short “analysis memo” explaining uncertainty and limitations (even without formal tests).
Step 6 (Week 10–14): “Light Python” for repeatability (recommended)
Python raises your ceiling because you can clean data reproducibly and re-run analysis easily. Keep it lightweight at first: focus on data loading, cleaning, grouping, merging, and exporting.
- pandas essentials: read CSV, filter, groupby, merge, missing values
- Basic plots: quick diagnostics and trend checks
- Export outputs: clean dataset for BI + summary tables
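The raw → clean → summarize → export loop might look like the sketch below with pandas. Column names and values are hypothetical, and it assumes pandas is installed:

```python
import io
import pandas as pd

# Hypothetical "raw" CSV export with a duplicate row and a missing value.
raw_csv = io.StringIO(
    "order_id,channel,amount\n"
    "1,email,20\n"
    "1,email,20\n"   # duplicate row
    "2,search,\n"    # missing amount
    "3,search,35\n"
)

# Raw -> clean: drop duplicates, handle missing values explicitly.
df = pd.read_csv(raw_csv)
df = df.drop_duplicates()
df = df.dropna(subset=["amount"])

# Clean -> summary: order count and revenue per channel.
summary = (
    df.groupby("channel", as_index=False)
      .agg(orders=("order_id", "nunique"), revenue=("amount", "sum"))
)
print(summary)

# Summary -> export: a clean file ready to load into the BI tool.
summary.to_csv("channel_summary.csv", index=False)
```

Keeping the whole loop in one notebook means you can re-run the analysis in seconds when next month's export arrives, which is the entire point of "light Python."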
Verified resources:
- Python documentation (official tutorial): Python Tutorial
- pandas user guide: Official pandas documentation
Deliverable: one notebook: raw → clean → analysis summary → export.
If Python feels heavy, keep it minimal here, but try not to skip it entirely.
Step 7 (Week 12–16): Portfolio capstone + job-search execution
One polished, end-to-end project beats five partial ones. Your capstone must look like real work: clear question, reproducible analysis, dashboard, and a short executive memo.
Capstone requirements:
- A realistic business scenario (sales, marketing, churn, operations)
- A metric dictionary (from Step 1 template)
- SQL extraction/transformation (from Step 3)
- Dashboard + executive memo (1–2 pages)
What a hiring manager should see in 60 seconds:
- Clear question + decision context
- Clean dashboard with definitions
- Short insight memo: “So what / Now what”
- Reproducible work: SQL + README + (optional) Python notebook
Beginner portfolio project ideas (choose 2 total):
- E-commerce performance: revenue, AOV, conversion proxies, repeat rate
- Marketing campaign analysis: CAC proxy, channel ROI, cohort retention
- Support/ops: SLA compliance, backlog trends, staffing implications
- HR analytics: attrition trends and drivers (avoid sensitive/identifying details)
Beginner job-search playbook (start Week 8 in parallel):
- Apply to roles emphasizing reporting, dashboards, Excel, SQL (not “data scientist”).
- Tailor your resume to: tools (SQL/BI) + 2–3 quantified portfolio outcomes + stakeholder communication.
Interview readiness checklist:
- Explain a join and grain in plain language.
- Explain a KPI definition and caveats.
- Walk through a dashboard: what changed, why, what to do next.
- Explain one analysis mistake you prevented (data quality, duplicates, seasonality).
Advancing technologies to be aware of (no mastery required yet):
- BI copilots / GenAI assistants: useful for drafting DAX/SQL or narratives—always validate outputs.
- Semantic layers / metric definitions: single-source-of-truth thinking for consistent KPIs.
- Privacy basics: never expose PII in portfolios; sanitize datasets and screenshots.
Deliverable: a “Beginner Proof Pack” (capstone repo + dashboard export + executive memo + metric dictionary).
Portfolio (Beginner Analyst Proof Pack)
Keep your portfolio simple: one capstone + 6 core artifacts. This is enough for most junior analytics interviews.
1) Featured Case Study (1 page)
- Business question + decision context (why it matters)
- Metrics used (definitions + caveats)
- What you found + what you recommend (“So what / Now what”)
- Links to the artifacts below
2) Core Analyst Artifacts (6 items)
- Metric dictionary (1 page)
- Clean workbook (raw → cleaned → pivots → summary tab)
- SQL repo (40–60 solved questions with comments)
- SQL mini case study (one business question with 4–6 queries + write-up)
- BI dashboard export (KPI overview + diagnostic page)
- Executive insight memo (1–2 pages)
3) Optional “Light Python” Add-on (recommended)
- Notebook: raw → clean → analysis summary → export outputs
Portfolio Rubric (Quick Self-Check)
Use this checklist to validate your portfolio. If you can tick most items, your portfolio is interview-ready.
Featured Case Study (1 page)
- Clear question + decision context (2–3 sentences)
- Defines 2–4 KPIs with formulas and caveats
- Shows analysis logic, not just charts
- Links to workbook, SQL, dashboard, memo
- Ends with 3–5 actionable recommendations
Metric Dictionary
- Metric grain is explicit (daily/user/order)
- Filters/exclusions are documented
- Known caveats are listed (missing data, bias, seasonality)
Spreadsheets
- Clear raw → clean → pivot → summary flow
- Basic QA checks exist (duplicates, missing values)
- Charts are readable (labels, scales, no clutter)
SQL
- JOIN logic is correct and explained
- Queries are readable (CTEs, comments)
- Handles grain and duplicates appropriately
Dashboard
- KPI overview + diagnostic breakdown page
- Titles and definitions are clear
- Exported cleanly (PDF/images) for portfolio review
Memo
- “So what / Now what” structure
- Mentions uncertainty and limitations
- Recommends next actions, not just observations
Final “Interview Ready” Test
- You can explain a join + grain in plain language
- You can walk through your dashboard in 90 seconds
- You can defend your KPI definitions and caveats
- Everything is linked from one portfolio page
Proof-of-work templates
Use these mini-templates to package your Beginner Analyst Proof Pack for resumes, portfolios, and interviews. Fill the inputs, then copy the output.
Resume bullet builder (Junior Data Analyst / Reporting Analyst)
Fill these inputs:
- Project: [e-commerce performance / marketing campaign / ops SLA / churn]
- Business question: [what decision this supports in 1 line]
- Tools: [Excel/Sheets] + [SQL] + [Power BI/Tableau] (+ [Python, optional])
- Artifacts: metric dictionary + cleaned dataset + 4–6 SQL queries + dashboard + 10-bullet insight memo
- Outcome: [insight found / issue prevented / time saved / KPI improved (if simulated, say “in a case study”)]
Copy/paste output:
Analyzed [project] to answer “[business question]” using [tools]; defined KPIs with a metric dictionary (grain + caveats), cleaned and QA’d raw data, built 4–6 readable SQL queries, and published a dashboard with drilldowns; summarized findings in an executive memo, resulting in [outcome].
See a real example
Analyzed an e-commerce performance case study to answer “Which channels drive repeat customers?” using Google Sheets, SQL, and Tableau; defined KPIs (Revenue, AOV, Repeat Rate) with grain and exclusions, cleaned and QA’d orders and customers data, wrote 6 CTE-based queries to segment cohorts, and published a KPI + diagnostics dashboard; summarized findings in a 10-bullet memo, identifying one channel with higher repeat rate but lower AOV and recommending a retention-focused budget split.
Featured case study (1 page)
Rule: Keep it scannable. A hiring manager should understand it in 60 seconds.
Copy/paste output:
Project: [name]
Decision context: [who needs this + what decision it informs]
Question: [primary business question]
Data: [source] | timeframe: [dates] | grain: [user/order/day]
KPIs (with definitions): [KPI 1], [KPI 2], [KPI 3] (link metric dictionary)
Method: Clean + QA → SQL extraction (CTEs, joins) → analysis (spreadsheets/Python) → dashboard
QA checks: [duplicates after joins], [missing values], [outliers], [date coverage], [filter exclusions]
Key insights (3 bullets): [insight 1], [insight 2], [insight 3]
Recommendation: [what to do next + expected impact]
Artifacts: Repo (README + SQL) | Dashboard PDF | Insight memo (1–2 pages)
See a real example
Project: Marketing campaign ROI + retention (case study).
Decision context: Marketing lead needs guidance on reallocating spend next month.
Question: Which channels produce the best 30-day retention at acceptable CAC proxy?
Data: ad spend + signups + orders | timeframe: Apr–Jun | grain: user + order.
KPIs: CAC proxy, 30-day repeat rate, 30-day revenue/user (metric dictionary linked).
Method: QA raw exports → SQL CTEs to join spend→signups→orders without duplicate inflation → cohort table → Tableau dashboard.
QA: validated join keys; checked duplicates post-join; removed test users; confirmed date coverage.
Insights: Paid Social had lowest CAC but weakest 30-day repeat; Email had highest repeat; Search had strongest revenue/user.
Recommendation: shift incremental budget from Paid Social to Search + expand email capture to lift retention.
Artifacts: GitHub repo (SQL + README), dashboard PDF, 1-page memo.
Interview answer (30–45 seconds): KPI + SQL + dashboard walkthrough
How to use: Read once, then speak naturally (don’t memorize word-for-word).
Copy/paste output:
I worked on a [project] analysis to answer “[business question]” for a [stakeholder].
First, I defined the KPIs in a metric dictionary, including the grain and exclusions, to avoid misinterpretation.
Then I cleaned and QA’d the data (missing values, duplicates, date coverage) and wrote SQL using CTEs to extract the analysis table.
A key challenge was [join/grain issue], so I validated row counts before/after joins and adjusted the logic to prevent duplicate inflation.
Finally, I built a dashboard with an executive KPI page and a diagnostic drilldown page, and I wrote a short insight memo: what changed, why, and what to do next.
The result was [result/insight], and the main lesson was [lesson about QA, definitions, or stakeholder clarity].
See a real example
I worked on an e-commerce performance analysis to answer “Why did revenue drop last month?” for a sales ops stakeholder. First, I defined Revenue, Orders, and AOV in a metric dictionary with grain and exclusions like refunds and test orders. Then I QA’d the exports and wrote SQL CTEs to build a daily fact table. The challenge was a join that doubled rows when orders had multiple line items, so I aggregated to order-level before joining dimensions. I built a Power BI dashboard with a KPI overview and drilldowns by channel and region, and summarized insights and actions in a one-page memo. The result was identifying a channel mix shift plus a stockout impact, and the main lesson was to lock grain early and validate row counts at every join.
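The row-count validation described in that walkthrough can be sketched with sqlite3. The tables are hypothetical; the point is comparing counts before and after a join to catch duplicate inflation:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE orders (order_id INT, channel TEXT);
CREATE TABLE line_items (order_id INT, amount REAL);
INSERT INTO orders VALUES (1, 'email'), (2, 'search');
INSERT INTO line_items VALUES (1, 10.0), (1, 5.0), (2, 8.0);
""")

def count(sql):
    return con.execute(sql).fetchone()[0]

before = count("SELECT COUNT(*) FROM orders")

# Naive join: line-item grain duplicates order 1, inflating the row count.
naive = count("""
    SELECT COUNT(*) FROM orders o
    JOIN line_items i ON i.order_id = o.order_id
""")

# Fix: aggregate line items to order grain BEFORE joining.
fixed = count("""
    SELECT COUNT(*) FROM orders o
    JOIN (SELECT order_id, SUM(amount) AS amount
          FROM line_items
          GROUP BY order_id) i
      ON i.order_id = o.order_id
""")

print(before, naive, fixed)  # the fixed join preserves the order count
```

Running this check after every join is a cheap habit that prevents the most common silent error in junior analyst work.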
Recommended Courses
Choose guided courses when you want structured instruction or faster tool ramp-up. You don’t need to finish everything—use the modules that close your skill gaps, then ship the roadmap deliverables as your proof.
Foundation spine (pick ONE)
Google Data Analytics Professional Certificate (Coursera)
Strong guided baseline for new analysts: structured lessons, hands-on practice, and a coherent end-to-end pathway. Pair with your Step 1 deliverables (repo + metric template) so it shows up as real proof, not just completion.
IBM Data Analyst Professional Certificate (Coursera)
Solid alternative foundation spine if you prefer IBM’s teaching style or want another well-known certificate pathway. Align course outputs to your Step 3 SQL repo + Step 4 dashboard deliverables for portfolio leverage.
Spreadsheets for analysis (optional credential)
Excel Skills for Business Specialization (Coursera — Macquarie)
Guided Excel progression for analysis and reporting hygiene. Best used alongside your Step 2 deliverables: messy-to-clean workbook + KPI sheet with a metric dictionary.
SQL fundamentals (choose ONE structured course)
SQL for Data Science (Coursera — UC Davis)
Structured SQL fundamentals with a clear learning sequence. Map every topic to your Step 3 proof: 40–60 solved SQL questions + one case study write-up in your repo.
The Complete SQL Bootcamp (Udemy)
Practical, drill-heavy SQL option if you want repetition and momentum. Keep your output “portfolio-shaped”: CTEs, comments, and joins explained at different grains (user vs order).
BI dashboards (choose ONE tool track)
Microsoft Learn — Get started with Microsoft data analytics
Start here if you choose Power BI for Step 4. Use the official path and docs to build dashboards with defined KPIs, drilldowns, and clean narratives.
Tableau Analyst Learning Path (Official)
Start here if you choose Tableau for Step 4. Follow the official learning path and publish your dashboards for portfolio review.
Statistics essentials (free, verified)
OpenStax Introductory Statistics (2e)
A rigorous but accessible foundation for interpreting uncertainty correctly. Use it to prevent common analyst mistakes: confusing correlation/causation, ignoring bias, or overstating small changes.
OpenIntro Statistics
Strong companion text with practice problems. Ideal for translating concepts into the “analysis memo” style deliverable in Step 5.
“Light Python” for repeatability (free, official)
Python documentation (official tutorial)
Use this to get comfortable with the language quickly. Your goal is not “software engineering”; it is repeatable analysis: load → clean → summarize → export.
pandas user guide
Go straight to the practical blocks: reading data, filtering, groupby, merge, missing values, and exports. This maps directly to your Step 6 notebook deliverable.
Optional credentials (only if you need a target)
Microsoft Certified: Power BI Data Analyst Associate (PL-300)
A strong “signal” credential if you choose the Power BI track in Step 4. Best taken once you can build a KPI overview dashboard + a diagnostic page with clear definitions and drilldowns.
Salesforce Certified Tableau Data Analyst
Best if you choose Tableau for Step 4 and want a formal certification target. Pair this with a published Tableau Public dashboard and a short insight memo (PDF) so your portfolio shows real analyst output.
Common Beginner Mistakes (and how to avoid them)
1) Collecting certificates instead of producing proof
Fix: publish a small Analyst Proof Pack (workbook + SQL repo + dashboard + memo). Hiring managers prefer evidence of skill over course badges.
2) Skipping SQL (or learning it too lightly)
Fix: treat SQL as non-negotiable. Solve 40–60 questions and complete one SQL mini case study (4–6 queries + short write-up).
3) Trying to learn every tool at once
Fix: choose one BI tool (Power BI or Tableau) and stick with it. Tool-switching kills momentum and weakens portfolio coherence.
4) Vague KPIs and unclear metric definitions
Fix: define each KPI with formula + grain (daily/user/order) + filters/exclusions + caveats. Ambiguity is a common reason portfolios fail interviews.
5) Ignoring data quality and QA
Fix: include QA checks in every project (duplicates, missing values, invalid dates, outliers). Show that you prevent mistakes, not just create charts.
6) Building dashboards without a narrative
Fix: export a dashboard and attach a short insight summary: “What changed, why, and what to do next.” Dashboards without decisions look like practice, not work.
7) Overdoing statistics and underdoing interpretation
Fix: master basics (bias, seasonality, correlation vs causation). You don’t need heavy math early—just correct reasoning and clear limitations.
8) Publishing risky data (PII) in portfolios
Fix: never publish personal or identifying data. Use public datasets or sanitized samples and avoid screenshots showing emails, names, or IDs.
9) Weak documentation (no README, unclear steps)
Fix: add a clean README: data source, steps, assumptions, how to reproduce outputs. Reproducibility is a hiring signal.
10) Applying too late (waiting until “perfect”)
Fix: start applying in parallel around Week 8. Your portfolio improves faster when you’re seeing real job descriptions and interview feedback.
Why Students Choose This Career Path
1) It is built for beginners (no experience required)
You start with practical analyst fundamentals: clean work habits, metric definitions, and reproducible workflows—before jumping into tools.
2) It produces proof-of-work, not just course completion
You publish an Analyst Proof Pack: a clean workbook, a SQL repo, a dashboard export, and a short executive memo—exactly what hiring managers want to see.
3) It focuses on the highest ROI skills first (Excel + SQL)
Most beginner roadmaps overemphasize “advanced” topics. This path prioritizes spreadsheets and SQL because they drive the most entry-level interviews.
4) One BI tool (no confusion, no tool overload)
You choose Power BI or Tableau and commit. That makes your portfolio coherent and prevents beginner burnout from trying everything.
5) It teaches professional analytics habits
You learn the “workplace operating system”: clear KPI definitions, QA checks, readable queries, dashboard hygiene, and short decision-oriented narratives.
6) It fits college and working schedules
The plan is modular: you can complete it fast (8–10 weeks) or gradually (4–6 months) without breaking the sequence.
7) It aligns with real entry-level roles
Junior analysts are often hired to support reporting, dashboards, and stakeholder updates. This path builds those exact signals with a capstone that looks like real work.
FAQs (Beginner Data Analytics Career Path)
1) Is this path suitable if I have zero experience?
Yes. This path is designed for beginners: you build fundamentals, then publish proof-of-work (workbook, SQL, dashboard, memo) that demonstrates job-ready skills.
2) Do I need to complete Google Data Analytics specifically?
No. You can choose a structured certificate (Google/IBM) or use free resources. The core requirement is producing the portfolio outputs, not collecting multiple certificates.
3) How long does the full path take?
Most learners finish in 12–16 weeks at a steady pace. Faster learners can finish in 8–10 weeks. Busy schedules typically take 4–6 months.
4) What jobs can I apply for after completing Track 1?
Typical entry roles include Junior Data Analyst, Reporting Analyst, BI Analyst (junior), Operations Analyst, and analyst internships.
5) Do I need a degree in statistics or computer science?
No. Many entry roles prioritize spreadsheets, SQL, dashboards, and communication. You should understand basic statistics concepts, but you don’t need advanced math to start.
6) Do I need Python?
Python is recommended but not required at the start. “Light Python” helps you clean data and repeat analysis. If Python feels heavy, keep it minimal—don’t let it block progress.
7) Power BI vs Tableau — which should I pick?
Pick Power BI if you’re targeting corporate roles where Microsoft tools are common. Pick Tableau if you want strong public portfolio publishing via Tableau Public. Choose one and commit.
8) What should my capstone look like?
One end-to-end project: a realistic business question, a metric dictionary, SQL work, a dashboard export, and a 1–2 page executive memo with recommendations.
9) Can I use public datasets for my portfolio?
Yes. Public datasets are fine. Make your work credible by documenting assumptions, cleaning steps, and limitations. Never publish personal or identifying data.
Next steps
Block your first 30-minute session this week and complete the Start Week 1 milestone.