Role Purpose & Context
Role Summary
The International Data Analyst is here to make sure our regional teams and senior analysts have the reliable data they need, when they need it. You'll spend your days gathering, cleaning, and preparing datasets from various international sources, which directly impacts how we understand market performance and customer behaviour across different countries. You'll work closely with your manager and other analysts, translating raw data into clear, digestible reports that regional marketing or sales teams use to tweak their strategies. When you do this well, our business leaders get accurate snapshots of what's happening globally, helping them spot opportunities or fix problems quickly. If the data's a mess, or if reports are late, it means decisions are made on shaky ground, or worse, not made at all. The biggest challenge? Getting consistent data from wildly different systems and making sense of it all. But the reward? You'll learn how a global business actually runs, seeing the impact of your work on real-world decisions, and you'll become a wizard with data tools.
Reporting Structure
- Reports to: International Analytics Specialist
- Direct reports: None (this is an individual contributor role)
- Matrix relationships: Junior International Analytics Specialist, Associate Data Analyst (Global), Technical Data Reporter (International)
Key Stakeholders
Internal:
- International Analytics Team (your manager and peers)
- Regional Marketing Teams (EMEA, APAC, LATAM)
- Regional Sales Teams (Country Managers)
- Product Data Team (for data definitions)
External:
Organisational Impact
Scope: Your work provides the foundational data that underpins all international business decisions. Accurate and timely reports from you mean regional teams can react faster to market changes, optimise campaigns, and understand customer behaviour. Essentially, you're building the bedrock for data-driven growth across our global operations. Get it right, and everyone benefits from clearer insights; get it wrong, and we're flying blind in complex international markets.
Performance Metrics
Quantitative Metrics
- Metric: Data Accuracy
- Desc: The precision of the data you extract and prepare for reports. This means no missing values, correct data types, and accurate aggregations.
- Target: <1% error rate on all manually pulled data
- Freq: Weekly/Monthly during peer reviews and spot checks
- Example: If you pull a report on Q3 revenue for Germany, it should match the official source exactly. A £500,000 discrepancy in a £5M report would be a 10% error, which is too high. We're aiming for virtually perfect.
- Metric: Report Turnaround Time
- Desc: How quickly you complete routine data requests and update recurring dashboards once you've been assigned the task.
- Target: 95% of standard ad-hoc data requests fulfilled within 48 hours
- Freq: Tracked via Jira tickets and project management tools
- Example: A Country Manager asks for a specific customer segment's engagement metrics on Monday morning. You should have that data back to them by Wednesday morning at the latest, assuming it's a standard request.
- Metric: Data Automation Contribution
- Desc: Your ability to identify and implement small automation improvements, reducing manual effort for yourself and the team.
- Target: Automate one weekly report or data preparation step within your first 6 months, saving at least 4 hours of manual work per week.
- Freq: Reviewed during quarterly performance discussions
- Example: You notice you're manually downloading a CSV from a regional system every Monday morning. You write a small Python script to pull that data directly, saving you a couple of hours each week.
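A first automation pass of that kind can be sketched in a few lines of Python. Everything here is illustrative — the column names (`country`, `revenue`) and the idea of receiving the export as raw text stand in for whatever your regional system actually provides:

```python
import csv
import io

def load_regional_export(csv_text: str) -> list[dict]:
    """Parse a regional CSV export into a list of row dicts.

    In practice you would fetch csv_text from the regional system's
    API or scheduled export (details vary by system); accepting the
    raw text here keeps the parsing step runnable anywhere.
    """
    reader = csv.DictReader(io.StringIO(csv_text))
    rows = []
    for row in reader:
        # Normalise the revenue column to a float; skip malformed rows
        # rather than silently carrying bad values into a report.
        try:
            row["revenue"] = float(row["revenue"])
        except (KeyError, ValueError):
            continue
        rows.append(row)
    return rows

sample = "country,revenue\nDE,1200.50\nFR,not_a_number\nJP,980.00\n"
rows = load_regional_export(sample)
```

Once a step like this is scripted, scheduling it (cron, Airflow, or whatever the team uses) is what turns it into the weekly time saving described above.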
- Metric: Adherence to Data Governance
- Desc: Following established protocols for data handling, privacy, and security, especially important with international data.
- Target: Zero breaches of data privacy or security policies
- Freq: Ongoing monitoring and incident reporting
- Example: You ensure that any personally identifiable information (PII) from EU customers is handled strictly according to GDPR guidelines, never storing it in unapproved locations or sharing it inappropriately.
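One common pattern behind that kind of handling is pseudonymising an identifier before it ever lands in a working table. This is only a sketch: the helper name and salt are invented, the salt would really come from an approved secrets store, and whether hashing alone satisfies GDPR for a given use is a judgement for the data protection team, not this script:

```python
import hashlib

def pseudonymise_email(email: str, salt: str = "placeholder-salt") -> str:
    """Replace a raw email with a salted hash so rows stay joinable
    across datasets without storing the PII itself. Normalising case
    and whitespace first means the same person always hashes the same.
    """
    normalised = email.strip().lower()
    digest = hashlib.sha256((salt + normalised).encode()).hexdigest()
    return digest[:16]
```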
Qualitative Metrics
- Metric: Active Learning & Skill Development
- Desc: Your proactive engagement with new tools, methodologies, and domain knowledge relevant to international analytics.
- Evidence: Asking thoughtful questions during team meetings, completing assigned online courses, demonstrating new SQL functions or Python libraries in your work, sharing interesting articles or insights with the team, seeking feedback on your code and analysis.
- Metric: Process Adherence & Documentation
- Desc: How well you follow established data processes and contribute to clear, concise documentation for your work.
- Evidence: Your SQL queries are well-commented and follow team style guides. You update Confluence pages for new data sources you've used. Your data cleaning steps are reproducible. You consistently use Jira for task tracking and updates.
- Metric: Team Collaboration & Support
- Desc: Your willingness to support your immediate team, ask for help when stuck, and contribute positively to the team environment.
- Evidence: You offer to help peers with routine tasks when your plate is clear. You proactively communicate when you're blocked or need assistance. You participate constructively in team discussions. You're generally a good egg to work with.
- Metric: Understanding of Business Context
- Desc: Your growing ability to connect the data you're working with to the actual business questions and regional challenges.
- Evidence: You can explain *why* a regional team needs a particular metric, not just *what* the metric is. You start to anticipate follow-up questions from stakeholders. You show curiosity about the different market dynamics in, say, Brazil versus Japan.
Primary Traits
- Trait: Curious Learner
- Manifestation: You're the type who, when faced with a new data source or a weird number, immediately thinks 'Why?' You'll dig into the documentation (or ask your manager) to understand where the data comes from, how it's defined, and what its limitations are. You're not afraid to admit you don't know something, but you'll always try to figure it out.
- Benefit: International data is a minefield of inconsistencies and cultural nuances. If you don't ask questions and genuinely want to understand, you'll inevitably pull the wrong numbers or misinterpret results. Our business relies on accurate data, and that starts with someone who's inherently curious about what they're looking at.
- Trait: Data Detective (in training)
- Manifestation: You're not taking any number at face value, even at this early stage. You'll spot that a currency conversion looks off or that a date format is inconsistent across countries. You'll flag potential issues to your manager, even if you don't know how to fix them yet. You're building the habit of double-checking your work before it leaves your desk.
- Benefit: One tiny error in an international report can lead to big, expensive mistakes for the business. Think about a misplaced decimal point in a revenue forecast for a new market. Your job is to be an early warning system, catching those small discrepancies before they become major problems. It's about building trust in our data.
- Trait: Organised & Methodical
- Manifestation: When you get a data request, you don't just dive in headfirst. You'll clarify what's needed, make a small plan, and then execute it step-by-step. Your SQL queries are tidy, your Python scripts are commented, and you know where all your files are. You're the kind of person who keeps a clean desk (or at least a clean digital workspace).
- Benefit: International analytics can get messy, fast. You'll be dealing with data from different countries, different systems, and often under tight deadlines. Without a structured approach, you'll quickly get lost, miss crucial details, or create work that's impossible for others to understand or reproduce. Being organised means you're reliable, and that's gold.
Supporting Traits
- Trait: Cultural Astuteness (developing)
- Desc: An innate curiosity and respect for cultural differences that informs your analysis. You're starting to recognise that what works in one country might not in another, and you're open to learning about those distinctions.
- Trait: Team-Oriented
- Desc: You're happy to ask for help when you're stuck and keen to support your teammates when they need an extra pair of hands. You understand that we achieve more together.
- Trait: Resilient to Ambiguity (learning)
- Desc: You can handle situations where the data isn't perfect or the request isn't crystal clear. You're learning to ask clarifying questions rather than getting frustrated.
Primary Motivators
- Motivator: Learning & Skill Mastery
- Daily: You'll be constantly picking up new SQL tricks, Python libraries, or dashboarding techniques. Every day offers a chance to deepen your understanding of data, tools, and international business. You'll get regular feedback and dedicated time for learning.
- Motivator: Problem Solving & Puzzle Unravelling
- Daily: A lot of your day will involve figuring out why a number looks wrong, how to join two disparate datasets, or how to present complex information simply. It's like solving a new puzzle every day, with real business impact.
- Motivator: Contributing to Global Impact
- Daily: Even at this level, your accurate data feeds into decisions that affect millions of customers across the world. You'll see your reports being used by regional teams to improve their operations and grow the business.
Potential Demotivators
Honestly, this job isn't for everyone. You'll spend a fair bit of your time on what some might call 'grunt work'—cleaning messy data, chasing down data definitions, and updating routine reports. The 'urgent' request that disrupted your Thursday might get deprioritised on Friday, and you'll often be working on tasks that are part of a much bigger picture, so you might not always see the final outcome of your efforts. If you need to see every piece of your work make it to a grand presentation or directly impact a major product launch, you might struggle here. This role is about building the foundations, which isn't always glamorous.
Common Frustrations
- The 'Data Janitor Reality': You'll spend 60% of your time cleaning, joining, and validating data from disparate regional systems with inconsistent formats (e.g., `dd-mm-yyyy` vs. `mm-dd-yyyy`), currencies, and languages. It's not always the exciting modelling you might imagine.
- The Time Zone Gauntlet (learning to navigate): You might have early morning check-ins with APAC or late evening follow-ups with the US West Coast, especially as you learn. It can mess with your routine.
- Apples-to-Oranges Comparisons: You'll be asked to compare metrics between vastly different markets (e.g., Germany vs. Indonesia) and you'll need to learn how to explain why those comparisons can be misleading.
- The 'Lost in Translation' Data: Dealing with product feedback or survey responses in multiple languages, where nuance is critical but easily lost through automated translation, can be a headache.
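The date-format frustration above has a standard fix: normalise everything to ISO 8601 on the way in. A minimal sketch, assuming you have confirmed each source's format with the regional team (the source names here are invented — a value like `03-04-2024` is ambiguous on its own, so you never guess the format from the data):

```python
from datetime import datetime

# Illustrative mapping of source system -> its declared date format.
SOURCE_FORMATS = {
    "uk_orders": "%d-%m-%Y",   # dd-mm-yyyy
    "us_orders": "%m-%d-%Y",   # mm-dd-yyyy
}

def to_iso(date_str: str, source: str) -> str:
    """Normalise a source-specific date string to ISO 8601 (yyyy-mm-dd)."""
    return datetime.strptime(date_str, SOURCE_FORMATS[source]).date().isoformat()
```

Note how the same input string yields two different dates depending on the declared source — exactly why the mapping has to be confirmed, not inferred.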
What Role Doesn't Offer
- Full autonomy on project design or strategy (that comes later).
- Direct management of a team (you'll be mentored, not mentoring others yet).
- Immediate high-level strategic influence (you're building the data for it, though!).
- A purely 'clean data' environment (the reality is always messier).
ADHD Positives
- The constant variety of data sources and regional contexts can keep things interesting, preventing boredom.
- The need to quickly switch between different data requests can suit individuals who thrive on varied tasks and quick pivots.
- Opportunities to deep-dive into specific data anomalies can be highly engaging for hyper-focus tendencies.
ADHD Challenges and Accommodations
- Repetitive data cleaning tasks might be challenging; we can use automation tools (like dbt or Python scripts) to minimise this where possible.
- Staying organised across multiple data sources and requests can be tough; we use Jira for task management and provide clear templates for documentation.
- Time-zone differences for meetings might require flexible scheduling; we're open to discussing adjusted work patterns to accommodate peak focus times.
Dyslexia Positives
- Strong visual thinking can be a huge asset in understanding complex data relationships and designing clear dashboards in Tableau.
- The problem-solving nature of debugging SQL queries or Python scripts can be very engaging for those who excel at pattern recognition.
- Oral communication and storytelling with data can be a strength, especially when explaining findings to non-technical teams.
Dyslexia Challenges and Accommodations
- Extensive reading of technical documentation or complex SQL queries can be difficult; we encourage the use of screen readers, text-to-speech tools, and provide clear, well-formatted documentation.
- Writing detailed comments or reports might take longer; we focus on clarity and impact, and can provide templates or AI-assisted writing tools.
- Proofreading your own work can be tricky; peer reviews are standard, and we encourage using grammar/spelling checkers.
Autism Positives
- The logical, structured nature of data analysis, SQL, and Python can be a natural fit for systematic thinking.
- A strong focus on detail and accuracy is highly valued, especially in catching errors in complex datasets.
- The ability to concentrate deeply on data patterns and anomalies can lead to exceptional insights.
Autism Challenges and Accommodations
- Navigating unspoken social cues in team meetings or with stakeholders can be challenging; we focus on direct, clear communication and provide agendas in advance.
- Unexpected changes in data requests or project priorities can be unsettling; we aim for clear communication about changes and provide as much lead time as possible.
- Sensory environment: We offer a quiet working environment, and the option for noise-cancelling headphones is always there. We're also flexible with lighting and workstation setup.
Sensory Considerations
Our main office is typically a modern, open-plan environment, but we do have quiet zones and meeting rooms available for focused work or calls. We're generally a calm, respectful team. If you prefer a quieter setup or specific lighting, we're happy to discuss adjustments to your workstation. Social interactions usually happen in scheduled meetings or via chat, so you won't get constant interruptions.
Flexibility Notes
We believe in supporting everyone to do their best work. If you have specific needs or require adjustments, please talk to us. We're committed to creating an inclusive environment and are always open to discussing flexible working arrangements or tools that can help you thrive.
Key Responsibilities
Responsibilities by Experience Level
- Level: Entry Level (0-2 years)
- Responsibilities: Extract data from our Google BigQuery data warehouse using SQL, following established queries and templates, to support weekly and monthly regional performance reports.
- Clean and transform raw datasets using Python (pandas) or dbt, ensuring data quality and consistency before it's used in analysis (yes, it's tedious but absolutely critical).
- Build and update standard dashboards in Tableau, making sure they reflect the latest data and meet the basic requirements of regional marketing and sales teams.
- Assist senior analysts with ad-hoc data requests, which usually means pulling specific numbers, summarising them, and checking for any obvious anomalies.
- Document your data sources, cleaning steps, and report methodologies in Confluence, making sure others can understand and reproduce your work (future-you will be grateful).
- Learn and apply our internal data governance policies, especially those around data privacy (like GDPR) and cross-border data handling—you'll get training, don't worry.
- Participate actively in team meetings, asking questions and sharing any data quirks you've found, helping everyone stay on the same page about our international data landscape.
- Supervision: You'll have daily check-ins with your manager or a senior analyst, especially in your first few months. All your major data pulls, cleaning scripts, and dashboard updates will be reviewed before they go live. Think of it as paired work, with plenty of guidance and support.
- Decision: Honestly, you won't have much independent decision-making authority at this level. All technical approaches (e.g., which SQL query to use, how to clean a specific dataset) and data interpretations will be discussed and approved by your manager or a senior team member. Any contact with regional stakeholders or external partners will be supervised or escalated. This is a learning role, and we're here to guide you.
- Success: You'll be successful if you consistently deliver accurate data and reports on time, actively learn new tools and methodologies, and contribute positively to the team. Catching a data error before it becomes a problem, or suggesting a small improvement to a process, will show you're on the right track.
Decision-Making Authority
- Type: Data Extraction & Query Design
- Entry: Executes pre-defined SQL queries; makes minor modifications under guidance. All new queries reviewed by manager.
- Mid: Designs and writes standard SQL queries independently; complex queries reviewed by senior. Proposes new data sources.
- Senior: Designs complex, optimised queries and data models. Defines best practices for SQL. Mentors juniors on query design.
- Type: Data Cleaning & Transformation
- Entry: Performs cleaning tasks following established scripts/guidelines. Flags data quality issues to manager.
- Mid: Independently cleans and transforms datasets for projects. Proposes and implements new cleaning rules.
- Senior: Designs and implements robust data quality frameworks. Architects data transformation pipelines (dbt).
- Type: Report & Dashboard Creation
- Entry: Updates existing dashboards and creates simple reports from templates. All new dashboards reviewed.
- Mid: Builds new dashboards from defined requirements. Makes recommendations on visualisation best practices.
- Senior: Designs and architects complex, interactive dashboards. Defines reporting standards and governance.
- Type: Stakeholder Communication
- Entry: Responds to direct data requests under supervision; escalates complex questions to manager.
- Mid: Communicates directly with regional peers on data requests and report clarifications. Presents routine findings.
- Senior: Leads discussions with senior stakeholders, presenting complex insights and recommendations. Manages expectations.
- Type: Project Prioritisation
- Entry: Works on tasks as assigned by manager. Flags workload issues to manager.
- Mid: Manages own workload for assigned projects; prioritises routine tasks. Consults manager on conflicting priorities.
- Senior: Manages workstream priorities. Makes recommendations on project sequencing to manager/leadership.
AI Tool Opportunities
Tool: Global Feedback Synthesis
Benefit: Use AI to automatically read, translate, and summarise customer feedback (support tickets, app reviews) from dozens of languages. It'll cluster common themes like 'checkout issues in Brazil' or 'positive feature feedback in Korea', saving you hours of manual review and translation.
Tool: Early Anomaly Detection
Benefit: Deploy AI models that constantly monitor regional KPIs – think daily active users by city or conversion rates by device in each country. The AI automatically flags unusual spikes or drops, pointing you to potential problems or opportunities faster than you could ever spot them manually.
Tool: Accelerated Market Research
Benefit: Need a quick overview of a new market? Prompt an AI with 'Summarise key competitors, regulatory hurdles, and payment methods for e-commerce in Poland, citing sources.' Get a structured first draft of a market assessment in minutes, not days, giving you a massive head start.
Tool: Smart Report Summaries
Benefit: After you've pulled all the data for a report, use AI to generate different summaries. Ask it for: 1) A one-paragraph email summary for your manager, or 2) Bullet points for a regional sales team. It saves you ages re-writing for different audiences.
- Weekly time savings potential: 10-15 hours per week (once you're comfortable with the tools)
- Typical tool investment: Around £20-£50 per user per month in AI tools (e.g., GitHub Copilot, advanced LLM access) to get you started.
Competency Requirements
Foundation Skills (Transferable)
These are the fundamental skills that underpin everything you'll do. They're not just about being smart; they're about how you approach problems, work with others, and communicate your findings. We're looking for someone who demonstrates a solid grasp of these, even if they're still developing.
- Category: Communication & Collaboration
- Skills: Clear Written Communication: Can write concise emails and documentation that others can understand, even if English isn't their first language. No jargon, just plain facts.
- Active Listening: Pays attention to requests, asks clarifying questions, and makes sure they've understood the task before diving in. This saves a lot of rework.
- Teamwork: Works well with immediate team members, offers help when available, and knows when to ask for assistance rather than struggling alone.
- Category: Problem-Solving & Critical Thinking
- Skills: Structured Approach: Can break down a data request into logical, manageable steps. Knows how to follow a process to get from raw data to a finished report.
- Error Detection: Has a keen eye for inconsistencies or anomalies in data. Spots when a number 'just doesn't look right' and flags it.
- Basic Analytical Reasoning: Can understand simple data trends and draw initial, evidence-based conclusions, even if they're not yet forming complex insights.
- Category: Adaptability & Learning Agility
- Skills: Openness to Feedback: Actively seeks and incorporates feedback on their work, seeing it as an opportunity to learn and improve.
- Tool Proficiency: Quickly picks up new software and data tools. Not afraid to experiment (within safe boundaries, of course) or read documentation.
- Navigating Ambiguity (emerging): Can cope when requests aren't perfectly clear, asking the right questions to get to the core of what's needed.
Functional Skills (Role-Specific Technical)
These are the more technical and domain-specific skills you'll need to hit the ground running. We don't expect you to be an expert in everything, but a solid foundation in these areas will make a real difference.
Technical Competencies
- Skill: Data Extraction & Manipulation (SQL)
- Desc: The ability to write standard SQL queries to pull data from relational databases, including using JOINs, WHERE clauses, GROUP BY, and basic aggregations. You should be able to get the data you need without too much hand-holding.
- Level: Intermediate
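For a flavour of what 'Intermediate' means here, the sketch below runs a JOIN / WHERE / GROUP BY query against an in-memory SQLite database so it is self-contained. The table and column names are invented; the real warehouse is BigQuery, but the SQL shape is essentially the same:

```python
import sqlite3

# Toy stand-in for two warehouse tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INT, country TEXT, revenue REAL);
    CREATE TABLE countries (code TEXT, region TEXT);
    INSERT INTO orders VALUES (1,'DE',100),(2,'DE',250),(3,'JP',80);
    INSERT INTO countries VALUES ('DE','EMEA'),('JP','APAC');
""")

# Revenue and order count per region, filtered and sorted.
rows = conn.execute("""
    SELECT c.region, COUNT(*) AS n_orders, SUM(o.revenue) AS revenue
    FROM orders o
    JOIN countries c ON c.code = o.country
    WHERE o.revenue > 50
    GROUP BY c.region
    ORDER BY revenue DESC
""").fetchall()
```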
- Skill: Data Visualisation & Reporting
- Desc: Creating clear, understandable charts and dashboards that communicate key metrics effectively. This means knowing which chart type to use for different data, and how to make a dashboard easy for others to read.
- Level: Intermediate
- Skill: Data Quality & Validation
- Desc: Understanding why data quality matters and being able to perform basic checks to ensure data accuracy, completeness, and consistency before it's used in reports. This is about catching obvious errors.
- Level: Basic
- Skill: International Data Context (awareness)
- Desc: A basic understanding that data from different countries can have different meanings, formats, and regulatory requirements. You're starting to grasp the complexities of global data.
- Level: Basic
Digital Tools
- Tool: Google BigQuery
- Level: Intermediate
- Usage: Writing standard SQL queries with JOINs and WHERE clauses on existing tables. Using the UI to explore schemas and preview data for international datasets.
- Tool: Tableau
- Level: Intermediate
- Usage: Connecting to data sources, building standard charts (bar, line, map), and assembling them into pre-defined dashboards for regional performance reviews. Applying filters and parameters.
- Tool: dbt (data build tool)
- Level: Basic
- Usage: Running existing dbt models to refresh data transformations and understanding the basic structure of our data pipelines. Making minor edits to model SQL under guidance.
- Tool: Python (pandas, basic scripting)
- Level: Basic
- Usage: Reading data into a pandas DataFrame, performing simple manipulations (filtering, sorting, basic cleaning), and running pre-written analysis scripts in a Jupyter Notebook for ad-hoc requests.
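A small pandas sketch of the manipulations described — nothing more than drop-missing, enforce a dtype, and sort. The column names are hypothetical:

```python
import pandas as pd

# Hypothetical raw extract with one missing value.
raw = pd.DataFrame({
    "country": ["DE", "DE", "JP", "FR"],
    "signups": [120, None, 90, 45],
})

clean = (
    raw.dropna(subset=["signups"])        # drop rows missing the metric
       .astype({"signups": "int64"})      # enforce the expected dtype
       .sort_values("signups", ascending=False)
       .reset_index(drop=True)
)
```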
- Tool: Jira & Confluence
- Level: Intermediate
- Usage: Updating tickets, logging work, and following established sprint processes for analytics tasks. Reading and commenting on documentation in Confluence, especially for data definitions and methodologies.
Industry Knowledge
- Area: Basic Business Metrics
- Desc: Understanding common business metrics like revenue, customer acquisition, conversion rates, and user engagement, and how they apply in different international contexts.
- Area: Data Governance Principles
- Desc: An awareness of why data needs to be managed carefully, especially concerning privacy, security, and ethical use. You'll be learning our specific policies.
Regulatory Compliance
- Reg: GDPR (General Data Protection Regulation)
- Usage: Understanding the basic principles of data privacy for EU citizens and knowing when to escalate questions about handling sensitive data from these regions.
- Reg: CCPA (California Consumer Privacy Act)
- Usage: Recognising that US data also has specific privacy requirements and knowing where to find our internal guidelines for handling it.
Essential Prerequisites
- A Bachelor's degree in a quantitative field (e.g., Computer Science, Statistics, Mathematics, Economics, Engineering) or equivalent practical experience (e.g., a strong portfolio of data projects, relevant certifications).
- Demonstrable experience (0-2 years) in a data-focused role, internship, or significant academic project where you regularly used SQL for data extraction and analysis.
- Proven ability to create clear visualisations and reports, ideally using a tool like Tableau, Power BI, or Looker Studio.
- A genuine curiosity about international markets and how data can help us understand them better.
Career Pathway Context
These prerequisites are what we consider the absolute minimum to succeed in this entry-level role. We're looking for potential and a solid foundation, not perfection. If you've got the aptitude and a track record of learning quickly, we're keen to hear from you, even if your background isn't a perfect match on paper.
Qualifications & Credentials
Emerging Foundation Skills
- Skill: Prompt Engineering & LLM Integration (Basic)
- Why: This is critical within 6 months—it's already happening, not just a future thing. Competitors are using AI to draft reports and summarise data in minutes. Analysts who figure this out will simply be more productive.
- Concepts: Effective Prompting: Learning how to ask AI models the right questions to get useful data summaries, code snippets, or report drafts.
- Context Windows: Understanding how much information you can give an AI at once and why that matters for complex tasks.
- Output Validation: Knowing that AI can 'hallucinate' and how to double-check its outputs against your raw data for accuracy.
- AI for Code (e.g., SQL/Python): Using tools like GitHub Copilot to help you write SQL queries or Python scripts faster, and debug them more easily.
- Prepare: This week: Set up a free AI tool (like ChatGPT or Claude) and use it to summarise your emails or draft simple text.
- This month: Experiment with using AI to help you write or debug a simple SQL query or Python script.
- Month 2: Try using AI to summarise a complex data report into a few bullet points for your manager.
- Month 3: Share one way AI has saved you time with your team during a stand-up.
- QuickWin: Start using Claude or ChatGPT to draft email summaries, brainstorm ideas for visualisations, or generate code comments today—no approval needed, immediate benefit.
Advancing Technical Skills
- Skill: Advanced SQL & Data Modelling
- Why: As you progress, you won't just be pulling data; you'll be thinking about how data is structured and how to make it more efficient for analysis. This is crucial for handling larger, more complex international datasets.
- Concepts: Window Functions: Performing calculations across sets of rows, which is super useful for things like ranking or cumulative sums in specific regions.
- Common Table Expressions (CTEs): Organising complex queries into readable, modular blocks, making your SQL much easier to understand and debug.
- Basic Data Modelling: Understanding how to design tables and relationships in a way that supports efficient analysis, rather than just querying existing ones.
- Prepare: This quarter: Complete an online course on advanced SQL (e.g., on DataCamp or Udemy).
- Next quarter: Start refactoring your more complex queries using CTEs and window functions.
- Within 6 months: Propose a small improvement to an existing data table's structure to your manager.
- QuickWin: Look at some of the more complex SQL queries your senior colleagues write and try to understand what each part does. Ask them questions!
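A CTE and a window function can both be shown in one self-contained snippet. This runs against SQLite purely so it works anywhere (window functions need SQLite 3.25+, bundled with modern Python); the table and columns are invented, and near-identical SQL runs on BigQuery:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE daily_revenue (country TEXT, day TEXT, revenue REAL);
    INSERT INTO daily_revenue VALUES
        ('DE','2024-01-01',100),('DE','2024-01-02',150),
        ('JP','2024-01-01',80),('JP','2024-01-02',60);
""")

# CTE to name the working set, then a per-country running total.
rows = conn.execute("""
    WITH daily AS (
        SELECT country, day, revenue FROM daily_revenue
    )
    SELECT country, day, revenue,
           SUM(revenue) OVER (
               PARTITION BY country ORDER BY day
           ) AS running_total
    FROM daily
    ORDER BY country, day
""").fetchall()
```

Here `PARTITION BY country` restarts the cumulative sum for each market — exactly the 'cumulative sums in specific regions' case mentioned above.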
- Skill: Python for Advanced Data Analysis
- Why: While you start with basic pandas, Python's capabilities for statistical analysis, automation, and even basic machine learning are vast. It's the go-to language for more complex, bespoke analytical tasks.
- Concepts: Data Cleaning & Feature Engineering: More sophisticated techniques to prepare data for analysis, including handling missing values, outliers, and creating new variables.
- Statistical Testing: Using Python libraries to perform A/B test analysis, correlation, and regression, which is essential for understanding what's driving international performance.
- Automation with Python: Writing scripts to automate repetitive tasks beyond simple data pulls, like generating reports or sending alerts.
- Prepare: This quarter: Complete a Python for Data Science course focusing on pandas and NumPy.
- Next quarter: Use Python to automate one of your weekly data cleaning processes.
- Within 6 months: Work with a senior analyst to apply a basic statistical test (e.g., t-test) to a regional dataset.
- QuickWin: Start using Python to do things you currently do in Excel. It's a great way to build muscle memory and see the power of automation.
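As a rough sketch of what a two-sample comparison involves, here is Welch's t statistic built from the standard library alone. In practice you would reach for scipy.stats.ttest_ind, which also gives you a p-value; the conversion-rate data below are made up:

```python
import math
import statistics

def welch_t(a: list[float], b: list[float]) -> float:
    """Welch's two-sample t statistic: difference in means divided by
    the standard error of that difference, with each sample's own
    variance. A rough first check that two regional samples differ.
    """
    ma, mb = statistics.mean(a), statistics.mean(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

# Hypothetical conversion rates (%) for two market variants.
variant_a = [2.1, 2.4, 2.2, 2.6, 2.3]
variant_b = [1.8, 1.9, 2.0, 1.7, 1.9]
t = welch_t(variant_a, variant_b)
```

A large |t| (here well above 2) suggests the difference is unlikely to be noise, but interpreting it properly needs degrees of freedom and a p-value — which is where the senior analyst and scipy come in.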
Future Skills Closing Note
The key here is continuous learning. We don't expect you to know everything on day one, but we do expect you to be hungry to learn. We'll provide resources, mentorship, and opportunities, but ultimately, your growth will be driven by your own curiosity and effort. This isn't just a job; it's a journey of skill development in a fascinating, global field.
Education Requirements
- Level: Minimum
- Req: A Bachelor's degree in a quantitative field (e.g., Computer Science, Statistics, Mathematics, Economics, Engineering, or a related discipline)
- Alts: We're open to candidates with equivalent practical experience, a strong portfolio of data projects, or relevant certifications that demonstrate a solid understanding of data analysis principles and tools. Show us what you can do!
- Level: Preferred
- Req: A Master's degree in a quantitative field
- Alts: While not essential, a Master's can give you a deeper theoretical grounding. That said, practical experience often trumps an extra degree in our world.
Experience Requirements
You'll need 0-2 years of experience in a data-focused role. This could be through internships, academic projects where you worked extensively with real-world data, or an entry-level position that involved data extraction, cleaning, and basic reporting. We're looking for someone who's comfortable with numbers, has a basic grasp of SQL, and has messed around with data visualisations a bit. Experience with international data is a bonus, but certainly not a deal-breaker; we'll teach you that part.
Preferred Certifications
- Cert: Google Cloud Certified - Associate Cloud Engineer
- Prod: Google Cloud
- Usage: Shows a foundational understanding of Google Cloud Platform, which is where our data lives (BigQuery). It demonstrates you can navigate cloud environments, which is a big plus.
- Cert: Tableau Desktop Specialist
- Prod: Tableau
- Usage: Proves your ability to use Tableau effectively for data visualisation and dashboard creation, which is a core part of this role.
- Cert: SQL (various providers, e.g., DataCamp, Udemy)
- Prod: Various
- Usage: While not a single certification, any reputable SQL certification demonstrates your proficiency in querying databases, which is fundamental to this job.
Recommended Activities
- Completing online courses in advanced SQL, Python for data analysis, or data visualisation best practices (e.g., on Coursera, DataCamp, Udemy).
- Attending industry webinars or virtual meetups focused on data analytics or international business trends. It's a great way to learn and network.
- Building a personal portfolio of data projects (even small ones!) that showcase your skills in SQL, Python, or Tableau. This really helps us see your practical abilities.
- Reading books or articles on data ethics, data governance, or specific international market dynamics to broaden your understanding beyond just the technical.
Career Progression Pathways
Entry Paths to This Role
- Path: Graduate Scheme / Internship Programme
- Time: 6-12 months
- Path: Internal Transfer (e.g., from Customer Support, Operations)
- Time: 1-2 years (with prior data exposure)
- Path: Junior Data Role in a Smaller Company
- Time: 1-2 years
Career Progression From This Role
- Pathway: International Analytics Specialist (Level 2)
- Time: 2-3 years in current role
Long Term Vision Potential Roles
- Title: Senior International Analyst (Level 3)
- Time: 5-8 years from entry
- Title: Lead International Analyst / Staff Analyst (International Strategy) (Level 4)
- Time: 8-12 years from entry
- Title: International Analytics Manager (Level 5)
- Time: 12-16 years from entry
Sector Mobility
The skills you'll gain here—SQL, Python, data visualisation, understanding international business, and problem-solving—are highly transferable. You could move into broader data science roles, product analytics, business intelligence in other global companies, or even specialise in a specific industry like fintech or e-commerce. The world is your oyster, data-wise.
How Zavmo Delivers This Role's Development
DISCOVER Phase: Skills Gap Analysis
Zavmo maps your current competencies against all requirements in this job description through conversational assessment. We evaluate your foundation skills (communication, strategic thinking), functional skills (SQL proficiency, data visualisation), and readiness for career progression.
Output: Personalised skills gap heat map showing strengths and priorities, estimated time to competency, neurodiversity accommodations.
DISCUSS Phase: Personalised Learning Pathway
Based on your DISCOVER results, Zavmo creates a personalised learning plan prioritised by impact: foundation skills first, then functional skills. We adapt to your learning style, pace, and neurodiversity needs (ADHD, dyslexia, autism).
Output: Week-by-week schedule, each module linked to specific job responsibilities, checkpoints and milestones.
DELIVER Phase: Conversational Learning
Learn through conversation, not boring modules. Zavmo uses 10 conversation types (Socratic dialogue, role-play, coaching, case studies) to build competence. Practice presenting findings to regional teams, explaining data discrepancies, and defending your analysis in a safe AI environment before facing real stakeholders.
Example: "For 'Data Cleaning & Feature Engineering', Zavmo will guide you through analysing a messy multi-country dataset, identifying quality issues, and building a repeatable cleaning strategy."
DEMONSTRATE Phase: Competency Assessment
Zavmo automatically builds your evidence portfolio as you learn. Every conversation, practice scenario, and application example is captured and mapped to NOS performance criteria. When ready, your portfolio supports OFQUAL qualification claims and demonstrates competence to employers.
Output: Competency matrix, evidence portfolio (downloadable), qualification readiness, career progression score.