
    inyourroots® Fairness, Inclusion & AI Statement

    What this policy covers: This statement explains why inyourroots® exists, how we reduce bias and unfair barriers, and how we support accessibility and responsible AI use.

    Last updated: 18 February 2026

    Version: 1.0

    Quick summary

    • inyourroots® is built to reduce bias and remove unfair barriers from CV-first recruitment.
    • AI helps guide and explain recommendations, but it should not decide your future.
    • We do not use protected characteristics to decide what opportunities you can see.
    • If something feels unfair or inaccessible, report it and we will investigate.
    • Contact for concerns: support@inyourroots.com.

    Why inyourroots® exists

    A lot of recruitment starts with a CV. And honestly, that can be unfair.

CVs can reward people who have had more chances, more support, or who simply know how to “sound right” on paper. They do not necessarily reward the people with the most potential.

    inyourroots® is built to reduce bias and remove unfair barriers by helping you show who you are in a more human way, starting with your strengths, your preferences, and what you are like at your best.

    What we mean by “AI” on inyourroots®

    When we say “AI”, we mean technology that helps us:

    • Guide you through strengths discovery.
    • Suggest industries and roles that could suit you.
    • Explain recommendations in plain English.

    AI can help organise information and spot patterns, but it shouldn't decide your future. You stay in control.

    Our commitment to fairness

    We are committed to giving young people fair, inclusive career guidance.

    That means we work to:

    • Reduce bias that can show up in traditional recruitment.
    • Design for inclusion, including neurodivergent-friendly choices.
    • Make reasonable adjustments so the platform is usable for people with disabilities.
    • Be transparent about how recommendations are made.
    • Listen and improve when users tell us something feels unfair.

    What we do (and don't) use to make recommendations

    Strengths-first, not background-first. Recommendations are based on things like your strengths, interests, preferences, and what environments suit you.

    We do not use protected characteristics (like gender, race, religion, or disability) to decide what opportunities you can see.

    If we ever ask optional questions that relate to protected characteristics (for example, to help us check fairness), you will always have a “Prefer not to say” option, and it will not block you from using the platform.

    How we reduce bias in practice

    Bias can creep into any system, especially when it relies on data, wording, or assumptions. We actively work to reduce it by:

    • Reviewing our questions and wording to avoid assumptions (for example, about experience, resources, or confidence).
    • Checking our role and industry information for gaps or outdated content.
    • Testing recommendations to make sure similar answers lead to similar types of suggestions.
    • Watching for biased language in employer job posts and encouraging fairer wording.

    Accessibility (reasonable adjustments)

    We want inyourroots® to be usable for as many people as possible.

    That includes designing for accessibility, such as:

    • Clear navigation and readable layouts.
    • Compatibility with assistive technologies (like screen readers).
    • Options that support different needs and preferences (for example, reducing motion).

    If you are struggling to use any part of the platform, please tell us and we will do our best to help.

    How to raise a fairness or accessibility concern

    If something feels unfair, confusing, or inaccessible, we want to know.

    When you contact us, it helps if you include:

    • What happened (and what you expected to happen).
    • Screenshots (if you can).
    • The part of the platform you were using.
    • Anything you think might be relevant.

    What we will do next:

    1. Log your report.
    2. Investigate what happened.
    3. Fix issues where we can.
    4. Tell you what we have done (where appropriate).

Contact: support@inyourroots.com

    FAQ

Does inyourroots® use my protected characteristics to decide what I can see?

No. We do not use protected characteristics to decide what opportunities you can see.

What information do you collect about me?

Only what is needed to run the platform. If we ask optional questions, you can choose “Prefer not to say”.

Can I change my answers later?

Yes. You can update your answers so recommendations stay relevant.

What should I do if a recommendation feels wrong or unfair?

Tell us. Your feedback helps us improve and spot issues we might miss.

    Closing note

    inyourroots® is built to help you be seen for who you are, not judged by a CV template. We will keep working to reduce bias, improve accessibility, and explain our tech in a way that makes sense.

    Questions or concerns?

    If anything here feels unclear, tell us. We'll help.

Contact us about this policy: support@inyourroots.com
