The digital 360 review process: A guide to 360 assessments and actionable reports for consultants and HR professionals

Written March 27, 2025, by Jeroen De Rore

It’s 9 p.m. Jordan, a strategic HR consultant juggling three client projects, is still at his laptop, eyes bleary from copying feedback into yet another PowerPoint deck. Normally, he’s good at what he does, but the next morning the cost of his manual grind will be exposed: he mixes up the feedback from two different Monicas at two different companies and attributes a junior manager’s comment on poor leadership to the CEO…

Ouch. So much for trying to bootstrap his consultancy firm. The good news? The solution isn’t so difficult to set up. Let’s discuss how the 360 review process is done right:

What is a 360 review process and why is it powerful?

If you’ve never ever heard of a 360 review process or 360 assessments, quickly check out this introductory video by my colleague, Stacy, who explains it very well.

If you’re a consultant like Jordan, or an HR professional in charge of L&D, you know feedback is gold – but only if it’s structured, complete, and easy to interpret. Surface-level feedback leads to surface-level change and nothing more.

A well-run 360 review process uncovers blind spots and gaps, validates strengths, and sparks real conversations and action. For a consultant, it’s the kind of insight that turns vague recommendations, embellished on a few PowerPoint slides – or worse, dumped in a forest of pages with webs of words that nobody understands – into a targeted development plan. Ultimately, it’s this plan that will make your work a lot more valuable.

Key steps of the 360 review process

Unlike traditional top-down or bottom-up feedback, a 360 review gathers input from every direction. Generally, it starts with a self-assessment by the person you advise. 

What then follows is a set of assessments from different points of view: employees, managers, and other stakeholders.

Of course, you don’t then just pass on all the information you gather in its raw form. You summarize it in a way that is digestible and actionable. Let’s break down the process and highlight some important tips.

360 review process

As an example here, we’ll take the classical case of an employee who is being reviewed, in order to get a personal development plan. 

1. Why is self-assessment important?

Progress for any employee, at any level in an organization, starts with self-awareness. Asking the person to self-reflect gives you a baseline: how they think they’re doing. True self-awareness, of course, will come later, when you compare that baseline with the way others perceive them.

Ultimately, the delta between these two is what you’ll want to understand. So, a good self-assessment sets the stage for real insight and allows for side-by-side analysis with the other information you’ll collect during the 360 review process.
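To give you an idea of what that side-by-side analysis boils down to, here’s a minimal sketch in Python, with made-up competencies and 1-to-5 scores (the numbers are purely illustrative):

  # Hypothetical 1-5 scores per competency: the self-assessment versus
  # the average of what the reviewers will later report.
  self_scores = {"Communication": 4.0, "Coaching": 3.5, "Planning": 2.5}
  reviewer_avg = {"Communication": 3.2, "Coaching": 4.1, "Planning": 2.6}

  # The delta is what you want to understand: a positive gap suggests a possible
  # blind spot (self-rating higher than others), a negative gap a hidden strength.
  deltas = {c: self_scores[c] - reviewer_avg[c] for c in self_scores}

  for competency, gap in sorted(deltas.items(), key=lambda kv: abs(kv[1]), reverse=True):
      print(f"{competency}: self vs. others = {gap:+.1f}")

The biggest absolute gaps, in either direction, are usually where the most interesting development conversations happen.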

2. How to select the right 360 reviewers?

Once the self-assessment is complete, it’s time to invite the right mix of reviewers. For instance:

  • Peers
  • Direct reports 
  • Managers 
  • Customers or even other external stakeholders, like suppliers or partners

You should consider anyone who can speak to the individual’s performance from a unique angle. The power of a 360 lies in the diversity of perspectives. Be transparent with participants about the process. Make it clear that their feedback is confidential and meant for development, not judgment. 

So how do you pick the right 360 reviewers? 

It’s a more strategic question than it may seem. Yes, you might have an idea of who works closely with the person. But involving the person in selecting their own reviewers can be surprisingly effective.

Why? Because the person is more likely to perceive the 360 review process as fair, constructive, and collaborative. As a result, they’re much more likely to take the ultimate feedback to heart and have the self-determination to improve. 

That doesn’t mean the 360 feedback can’t be objective. It does mean you’ll have to carefully consider what 360 feedback tool and 360 feedback questions you’ll put in place. But we’ll get to that. 

It also doesn’t mean you have to give away total control over who gets to assess whom. You can create a fair shortlist for the person being assessed by letting them answer the following types of questions:

360 review process - How to shortlist the right reviewers

Consider this way of selecting 360 reviewers. The key message you’ll convey to the person being assessed: “This 360 review process is for you, more than it is about you.”

3. Execution of diverse assessments needed for a solid 360 analysis

To build a meaningful 360 analysis, it’s not just important to consider who gives feedback. In the end, what matters most is what they’re assessing. A thoughtful 360 review process goes beyond basic observations and captures multiple dimensions.

These will differ from situation to situation. For example, insights into someone’s coaching and mentoring skills are relevant when you’re assessing their leadership potential.
However, for a sales representative, the dimensions you’ll care about will more likely revolve around negotiation skills (well, duh), but you may also assess resilience and evaluate product knowledge.

So how do you determine what to include in your 360 analysis?

One way is to look at it through the lens of a Venn diagram. Find out what the priorities are, on both the individual and the organizational level. Is there any overlap? Then be sure to treat the dimensions that fall into the intersection zone as priorities for the 360 review.
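If it helps, here’s that same intersection logic as a trivial Python sketch, with hypothetical priorities:

  # Hypothetical development priorities, gathered at both levels.
  individual_priorities = {"negotiation", "coaching", "time management"}
  organizational_priorities = {"coaching", "product knowledge", "negotiation"}

  # The intersection zone: treat these dimensions as priorities for the 360 review.
  priority_dimensions = individual_priorities & organizational_priorities
  print("Assess with priority:", sorted(priority_dimensions))

  # Everything outside the overlap is weighed case by case (see below).
  remaining = (individual_priorities | organizational_priorities) - priority_dimensions
  print("Weigh case by case:", sorted(remaining))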

How to decide on priorities in the 360 review process

How do you decide whether or not to include what falls outside of the intersection zone?

In short, ask yourself this question: “Will the assessment support an upcoming responsibility, challenge, or known gap?” 

You may have several sales representatives whose current focus is very strongly on signing new deals. They aren’t in a managerial position, so coaching skills are not currently relevant – but they may become relevant soon enough – as many of the current managers are nearing an age where they could retire. In this case, assessing coaching skills may be a good idea.

Now, those coaching skills could be assessed in several ways:

  • You’ll want the self-assessment angle as a baseline.
  • You could let the person undergo an actual skill assessment to gauge their coaching potential.
  • On top of that, you could have peers assess their perception of the person’s coaching potential.

4. Collecting the data and turning it into a development plan: Digitize!

This all sounds like a lot of data to gather, doesn’t it? At this point, you’re perfectly entitled to wonder how you will process all that information.

The answer? Digitization. The higher the diversity of assessments in your 360 review process, the more important it becomes to digitize the process so the data can be structured and processed effectively. 

Why? Because that’s how you’ll turn data into an action plan for the assessed individual to make progress. We’ll get to this in the section about how to choose the right 360 feedback tool.

But first, I’d like to briefly share a few 360 assessment models I’ve come across during countless discussions with consultants and professional development coaches.

 

Want to exchange ideas?

Ever since I joined the Pointerpro team, I’ve spoken with almost a hundred users of our platform (of all generations) who courageously and successfully digitalize (part of) their consultancy. If you’re on the fence about making the online move, I’d love to convince you.

Get my two cents

Inspiring models for 360 analysis and 360 feedback examples

I’ll summarize a few 360 analysis models or frameworks below and give you my (humble) opinion on when I deem they’re useful – but also when they’re not.

Brace yourself. These models are vast. The reason I want to discuss them is to inspire you to develop your own 360 analysis model – with which you’ll emphasize your personal expertise.  

360 leadership assessment example: The Direction – Alignment – Commitment (DAC)™ Framework

The DAC™ model – one of the leadership models used by the Center for Creative Leadership – takes a relational approach to leadership. Instead of focusing on individual traits or positional authority, DAC shifts attention to what leadership produces within a group or organization: shared Direction, clear Alignment, and genuine Commitment. 

When to use:
The DAC™ framework is great when you’re assessing people who play key roles in shaping team dynamics. It’s particularly useful in cross-functional teams, agile organizations, and in leadership development programs where the goal is to build influence, not just individual performance. It helps surface how someone contributes to a team’s ability to move together with clarity and energy.

When not to use:
DAC™ may not be the best fit when you’re working with individual contributors who have little influence over team dynamics, or in organizations where leadership is still viewed in more traditional, top-down terms. If the broader environment doesn’t yet value distributed or collective leadership, this framework might feel disconnected from how success is measured on the ground.

360 sales assessment example: Richardson’s Sales Capability Framework

Richardson’s Sales Capability Framework offers a structured, behavior-based model to assess and develop sales professionals across every phase of the selling process.

It’s an ideal 360 feedback tool in sales roles because it breaks down sales excellence into specific, observable capabilities – not abstract traits. This makes it highly suitable to develop an action plan.

The framework defines 16 sales capabilities, supported by 58 behaviors, organized across three categories:

  • Sales methods: Capabilities that help salespeople meet standards and use tools effectively, such as forecasting accurately, managing a pipeline, and leveraging RevTech stacks.
  • Sales motions: Capabilities for creating, capturing, and growing business. This includes prospect messaging, diagnosing customer pain points, and negotiating with value.
  • Sales meetings: Capabilities that focus on engaging buyers during conversations, such as storytelling, solution recommendation, and objection handling.

Richardson’s Sales Capability Framework

When to use:
This framework is most useful when assessing customer-facing professionals in B2B or consultative sales roles. It helps sales managers, coaches, and consultants identify specific behaviors tied to business metrics like deal velocity or win rates. It also allows for more personalized feedback, instead of one-size-fits-all reviews.

When not to use:
The framework is probably overbuilt for high-volume transactional sales environments, such as outbound call centers or low-touch eCommerce. In those contexts, agility and repetition matter more than nuanced capabilities like value positioning or stakeholder alignment.

Universal 360 assessment example: SHL Universal Competency Framework (UCF)

The SHL Universal Competency Framework (UCF) is a versatile, research-backed model designed to assess behavioral competencies across roles, industries, and career levels. Unlike leadership-specific models, the UCF is broad and flexible, making it especially useful for 360 feedback in non-leadership roles – from operations to administration, technical roles, and early-career talent.

At its core, the framework organizes competencies into eight overarching performance domains, such as “Creating & Conceptualising”, “Interacting & Presenting”, and “Supporting & Cooperating.”

Each domain includes more granular competency dimensions (20 in total), like Adapting and Coping, Analyzing, or Following Instructions. These are then further broken down into 96 observable behaviors – making the framework highly practical for designing questionnaire-based assessments.

When to use:
The strength of the UCF lies in its universality and scalability. For consultants or HR professionals running a 360 review process, this means you can tailor the assessment to the individual’s role – whether they’re a customer service rep, analyst, or junior team member – while maintaining consistency across the company.

When not to use:
As a copywriter in Pointerpro’s creative marketing team, I must admit I feel the UCF may be less effective in highly creative, fluid environments where predefined behavioral standards feel restrictive.

 

What is the right 360 feedback tool for you? From basic to advanced setup.

But enough theory. Let’s assume you have chosen or developed the 360 review framework that fits your goals. 

Now it’s time to bring it to life. And as I mentioned earlier, you’ll be gathering a lot of data that you’ll need to process and turn into action plans. So, your best bet is to use a digital 360 feedback tool.

Let’s go from a basic setup to the fully professional setup you could have. I assume you’re most likely a Microsoft 365 or Google Workspace user, so I’ll address some tools in those software stacks that you probably already have access to. Then we’ll look at an alternative, more focused solution: 360 feedback software.

Beginner setup: Microsoft Forms or Google Forms for 360 degree feedback

If you’re looking for a simple, no-cost way to manage a 360 review process, tools like Google Forms or Microsoft Forms can be an effective starting point. They let you create your own questions, send them to selected reviewers, and gather feedback in a structured way – all without needing any real tech savviness.

How to set up a 360 review process with Forms (in summary)

  • Create a new form and add your 360 questions
  • Share the form with selected reviewers via email 
  • Collect responses in the (automatically) connected spreadsheets
  • Manually analyze and compile results into a feedback report (see the sketch below)
Example of a Google Form
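That last manual step is also where a few lines of scripting can already save you time. Here’s a minimal sketch in Python with pandas, assuming you’ve exported the responses to a hypothetical responses.csv with one column per Likert question, scored 1 to 5:

  import pandas as pd

  # Hypothetical Forms export: one row per reviewer, one column per question.
  responses = pd.read_csv("responses.csv")

  # Average score per question across all reviewers; here we assume the
  # question columns are the ones whose names start with "Q".
  question_columns = [c for c in responses.columns if c.startswith("Q")]
  averages = responses[question_columns].mean().round(2)

  # Lowest-scoring questions first: these are the likely development areas.
  print(averages.sort_values())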

When to use:
This setup is ideal when you’re running a small, low-stakes pilot to put your own new 360 framework to the test. In other words, when you’re working with a handful of participants and reviewers. It’s great because there’s some flexibility and automation in the process, and your setup cost is basically included in your Microsoft or Google subscription.

Key limitation:
This practice is basically a data dump (pardon my French). Everything after data collection is manual work… unless you go for the medium setup below.

Medium setup: Microsoft or Google spreadsheets for 360 degree feedback

Spreadsheets give you a central command center to manage your entire 360 review process – not just gather responses.

By using them as your central tool, you’re not just collecting data. You’re organizing the process: assigning reviewers, tracking who’s completed their feedback, categorizing responses, and spotting patterns. You can use conditional formatting to flag outliers, formulas to calculate average scores per competency, and tabs to break out different groups or participants. In short: it turns your 360 review process into a manageable project.

How to set up a 360 review process with spreadsheets (in summary):

  • Build a multi-tabbed spreadsheet to manage participant lists, reviewer assignments, response status, and raw feedback
  • Design your own scoring summaries using spreadsheet formulas (e.g. averages per competency)
  • Add visual cues (like heatmaps) to highlight strengths and gaps across competencies
  • Use comment sections to manually annotate key insights and action points

Ideally, you’ll still collect response data through a linked form that feeds into your sheet. Of course, you can also copy-paste responses from email or interviews.

360 review spreadsheet example
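And if you ever want to reproduce the sheet’s averaging and flagging logic outside the spreadsheet, it only takes a few lines. A minimal sketch in Python with pandas, using hypothetical data and column names:

  import pandas as pd

  # Hypothetical raw feedback, as it might sit in your "raw feedback" tab:
  # one row per answer, with the reviewer group, the competency, and a 1-5 score.
  raw = pd.DataFrame({
      "reviewer_group": ["Peer", "Peer", "Direct report", "Manager", "Customer"],
      "competency": ["Coaching", "Communication", "Coaching", "Coaching", "Communication"],
      "score": [3, 4, 2, 4, 5],
  })

  # Average score per competency, broken out by reviewer group (your pivot-table tab).
  summary = raw.pivot_table(index="competency", columns="reviewer_group",
                            values="score", aggfunc="mean")
  print(summary)

  # The equivalent of conditional formatting: flag competencies where any
  # reviewer group's average drops below a chosen threshold.
  THRESHOLD = 3
  print("Possible gaps:\n", summary[summary.lt(THRESHOLD).any(axis=1)])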

When to use:
A spreadsheet-centered setup is ideal when you’re running several 360 reviews at once and need visibility across participants. It works especially well if you want to experiment further with your own analysis methods, monitor who has submitted feedback, and involve a higher number of reviewers per assessed employee.

Key limitations:

  • Everything after data processing and dashboarding is manual. Your spreadsheet setup can provide you with a solid analysis and even visualize the data in charts. However, translating the findings into an actual feedback report for each assessed person is entirely up to you.

Advanced setup: 360 feedback software

If you’re ready to take your 360 review processes to a fully professional level – with freedom to automate and scale – you should go for all-in-one assessment software. In other words, a platform that combines questionnaire building and distribution, data collection, dashboarding, report creation – and report delivery to stakeholders.


Sounds complex? If you’ve done the medium setup, you’ve actually already done much of your homework. You’ll set up the same question logic, formulas, and calculations, but the key differentiator lies in 360 feedback reporting.

Based on variable outcomes and scores for individual or combined review domains, you’ll be able to generate tailored feedback for any respondent, even as a full-blown PDF. You’ll do so based on the aggregate response data from the various reviewers.


How to set up a 360 review process with assessment software (in summary):

  • Start by building your 360 questionnaire using the platform’s editor 
  • Define scoring logic and categorize questions by domain
  • Define personalized feedback messages to be auto-generated in the assessed employee’s PDF report, based on conditions like aggregate scores (see the sketch after this list).
  • Design your report template – including charts, written feedback, visuals, and branding
  • Launch the distribution, collect responses, and let the software generate reports – all from within the platform
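To make the conditional part of that third step concrete, here’s a minimal sketch of the kind of rule you’d configure, expressed in Python. The thresholds and wording are hypothetical placeholders, not Pointerpro’s actual settings:

  # Hypothetical aggregate scores per review domain (averaged across reviewers, 1-5 scale).
  domain_scores = {"Leadership": 4.3, "Communication": 2.7, "Negotiation": 3.4}

  def feedback_for(domain: str, score: float) -> str:
      # Conditional rules: each score band maps to a pre-written feedback message.
      if score >= 4.0:
          return f"{domain} is seen as a clear strength. Keep leveraging it."
      if score >= 3.0:
          return f"{domain} is solid, with room to grow through targeted practice."
      return f"{domain} was flagged by several reviewers and deserves priority in your development plan."

  for domain, score in domain_scores.items():
      print(feedback_for(domain, score))

You define those bands and messages once, and every generated report then picks the right paragraph automatically.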

Using 360 feedback software: Best practices from questionnaires to 360 review report

I’d like to close with some best practices for getting to the most professional end product for your business, which essentially consists of two things: the questionnaire you’ll distribute and the 360 report your end user receives.

360 feedback questionnaire guidelines

Align questions to your framework

I know. This is stating the obvious. But you’d be surprised at how quickly you could lose yourself in creating questions that aren’t actually measuring what your framework requires them to measure.

Start with your 360 framework – whether it evaluates leadership behaviors, management skills, business knowledge or all of them combined – and come up with questions that directly reflect those dimensions.

Diversify your question types (but not too much)

In a 360 review, questions don’t need to be open-ended to be insightful. In fact, structured formats are your best friends when it comes to getting consistent, easy-to-analyze feedback.

The two most common question types we see Pointerpro users apply:

  • Likert scale questions (e.g. “Strongly disagree” to “Strongly agree”) are perfect for assessing behaviors across a range. They make it easy to compare perceptions across reviewer groups and to track progress over time.

  • Multiple choice questions are useful when you want to force a choice or help respondents reflect more intentionally. To ensure an engaging respondent experience, you could also use visuals to support your multiple choice options.  

 

360 assessment question types

But don’t overdo it. Diversifying too much can confuse reviewers and make your data harder to compare. Stick to a consistent format within each section or competency to keep the experience smooth, and the insights clean. The sweet spot is probably 2 or 3 question types.

Customize 360 feedback questions and answer options to the reviewer type (advanced tip)

Peers, direct reports and customers all have a different relationship with the person you’re assessing in the 360 review process. 
To measure the same thing, you might need to speak each group’s specific language in your questionnaire. In other words, you may want to phrase questions and answer options differently per reviewer type. This implies you’ll create distinct questionnaires, whose data you’ll later combine into aggregated scores.

An example of a multiple choice question to gauge a person’s written communication skills, with an answer option that would earn the person, say, 2 points out of the possible 4 awarded for the best answer option:

  • Peer questionnaire: “When this person writes emails or Slack messages, how clear are they?” ⇒ Option B: “Their messages are sometimes unclear or need follow-up clarification.”
  • Direct report questionnaire: “How easy is it to understand this person’s written instructions or guidance?” ⇒ Option B: “Sometimes the message lacks clarity, and we need to speak face-to-face before I can proceed.”
  • Customer feedback questionnaire: “How would you rate the clarity of this person’s written communication with you?” ⇒ Option B: “Some emails or proposals were unclear and required follow-up.”
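Behind the scenes, those differently phrased questionnaires all feed the same score. As a minimal sketch (with hypothetical point values), mapping each reviewer type’s chosen option to the shared 0-to-4 scale and rolling it up in Python:

  # Hypothetical option-to-points mapping shared by all three questionnaire variants.
  points_per_option = {"A": 4, "B": 2, "C": 1, "D": 0}

  # The option each reviewer type selected for the written-communication question.
  answers = {"peer": "B", "direct_report": "B", "customer": "A"}

  scores = [points_per_option[option] for option in answers.values()]
  aggregate = sum(scores) / len(scores)
  print(f"Written communication: {aggregate:.1f} / 4 (aggregated across reviewer types)")

Because each variant maps to the same scale, you can still report one aggregated score per competency, even though every reviewer group answered in its own language.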

Keep the questionnaires short but meaningful (thanks to question logic)

Nobody enjoys answering 60 questions – especially if they’re reviewing multiple people. Aim for a review that takes 10-15 minutes tops. Typically for purposeful assessments like these, 20-25 questions max is a sweet spot for any respondent.

A great trick to make the most of your respondents’ willingness to participate?
Applying question logic, often referred to as survey logic. Here’s another video by my colleague Stacy, who once again explains it very clearly.
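Conceptually, question logic is nothing more than a routing rule: the answer to one question decides which questions a reviewer sees next. A toy sketch in Python, with hypothetical sections and rules:

  # Hypothetical routing: reviewers who never see the person coach others
  # skip the coaching section, which keeps their questionnaire short.
  def sections_for(reviewer_type: str, observes_coaching: bool) -> list:
      sections = ["collaboration", "communication"]
      if observes_coaching:
          sections.append("coaching")
      if reviewer_type == "customer":
          sections = ["communication", "reliability"]  # customers get their own short track
      return sections

  print(sections_for("peer", observes_coaching=False))
  print(sections_for("manager", observes_coaching=True))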

The report: Actionable 360 feedback examples

Obviously, a 360 review process has little to no use if there is no clear action plan that follows. And just like with Jordan, the HR consultant from our intro, this is where things tend to go off the rails for many – even when the 360 feedback has been efficiently collected.  
That’s why assessment software like Pointerpro automates the 360 review process all the way up to fully personalized PDF report generation. 

In an individual assessment scenario – if you created your report template with conditional feedback, using the “individual report” option – a questionnaire respondent receives an individual report at the click of a download button at the end.


In a typical 360 review scenario, you’ve distributed a questionnaire to collect the insights from multiple reviewers about one person. In 360 assessment software like Pointerpro, you therefore choose the “Group report” option.

Group Report type choice in Pointerpro

360 feedback example #1: Start with a “How to read this report”

We all know what it can feel like to receive feedback. It’s not always easy to interpret. Also, a 360 review can reveal some really confronting information. So, before diving into scores and comments, open the report with a short guide that explains what the reader will be looking at. This builds trust, reduces confusion, and helps the recipient approach the feedback with the right mindset.



360 feedback example #2: Insert curated key comments per review domain

A 360 review is powerful because it’s an aggregate report that uses response data from different people to quantify average ratings across different domains.

But you don’t want to lose the personal touch by only discussing the numbers. Therefore, it’s good to leave some space for open-ended comments the reviewers typed up themselves.

If the 360 review touches on various domains, it’s useful to open each domain section of the report with a few of these curated comments. The way to get there is by inserting an open-ended question at the end of every key section in the questionnaire.


360 feedback example #3: Visualize alignments and gaps

I mentioned the importance of using self-assessment as a baseline. Comparing the perceptions of others with your self-perceptions is highly valuable. It’s therefore also an essential element to visualize. A spider chart tends to be the weapon of choice for consultants and companies that use Pointerpro for their 360 review process.
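For illustration, here’s a minimal matplotlib sketch of such a spider chart, plotting hypothetical self-assessment scores against the reviewer average (in practice, the 360 feedback software generates this for you):

  import numpy as np
  import matplotlib.pyplot as plt

  # Hypothetical 1-5 scores per competency.
  competencies = ["Communication", "Coaching", "Planning", "Negotiation", "Resilience"]
  self_scores = [4.0, 3.5, 2.5, 3.0, 4.5]
  others_avg = [3.2, 4.1, 2.6, 3.8, 4.0]

  # Spread the competencies evenly around the circle and close the polygon.
  angles = np.linspace(0, 2 * np.pi, len(competencies), endpoint=False).tolist()
  angles += angles[:1]

  fig, ax = plt.subplots(subplot_kw={"polar": True})
  for label, values in [("Self-assessment", self_scores), ("Reviewer average", others_avg)]:
      data = values + values[:1]  # repeat the first value to close the loop
      ax.plot(angles, data, label=label)
      ax.fill(angles, data, alpha=0.1)

  ax.set_xticks(angles[:-1])
  ax.set_xticklabels(competencies)
  ax.set_ylim(0, 5)
  ax.legend(loc="upper right")
  plt.show()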


360 feedback example #4: Formulate an action plan

“What now?” That’s the question you want to avoid like the plague.
At the end of the report, you want to formulate concrete actions for the reader to take. The more concise and concrete, the better.


PS: To be sure you’re consistent in the actions you attribute to the people being assessed, you’ll of course apply scoring formulas and conditional rules, as in many other places in your report.
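Conceptually, such a rule can be as simple as: only the lowest-scoring competencies make it into the plan, each linked to one pre-defined action. A minimal sketch with hypothetical scores, thresholds, and actions:

  # Hypothetical library of concrete development actions per competency.
  action_library = {
      "Coaching": "Shadow a senior manager in two coaching conversations this quarter.",
      "Communication": "Ask a peer to review your next three client proposals for clarity.",
      "Planning": "Introduce a weekly planning review with your direct reports.",
  }

  # Aggregate reviewer scores per competency (1-5), pulled from the 360 data.
  reviewer_avg = {"Coaching": 2.6, "Communication": 3.9, "Planning": 3.1}

  # Conditional rule: only competencies below the threshold enter the action plan,
  # lowest first, capped at two actions to keep the plan concise and concrete.
  THRESHOLD = 3.5
  plan = [action_library[c]
          for c, score in sorted(reviewer_avg.items(), key=lambda kv: kv[1])
          if score < THRESHOLD][:2]

  for number, action in enumerate(plan, start=1):
      print(f"{number}. {action}")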


Want to learn more about how to generate auto-personalized 360 reports?

I’m aware this article is a vast introduction to digitalizing your 360 review process, and that 360 feedback software is possibly very new to you (our platform is pretty much the only one out there to do what it does).

That’s why new Pointerpro users don’t simply subscribe to our tool and are then left to find things out for themselves. We always start with an introductory call to understand your specific needs, then plan a personalized demo, and when you decide to sign on, you get to spar with our onboarding specialists during 1 to 3 sessions.

Get started by asking for a live introduction here


About the author:

Jeroen De Rore

As Creative Copywriter at Pointerpro, Jeroen thinks and writes about the challenges professional service providers find on their paths. He is a tech optimist with a taste for nostalgia and storytelling.