Lupr is currently under development.

Feedback management

Everything about collecting, managing, and acting on feedback in Lupr.

Feedback lifecycle

Every piece of feedback follows a structured flow from invite to submission. This ensures reviewers understand the confidentiality expectations before they see your content, and that you receive consistent, structured responses.

1. Invite sent

The project owner sends a unique, secure invite link to a reviewer via email.

2. Reviewer previews pitch

The reviewer sees the project title, summary, and protection level. No sensitive content is revealed yet.

3. Accepts confidentiality

Depending on the protection level, the reviewer signs a pledge, built-in NDA, or custom NDA before proceeding.

4. Views pitch (watermarked)

The reviewer reads the full pitch content. Every page is overlaid with a watermark containing their email and a timestamp.

5. Submits structured feedback

The reviewer rates the idea, answers custom questions, and leaves written feedback. All responses are tied to their identity.

Watermarking: When a reviewer views your pitch, the content is overlaid with their email address and a timestamp. This creates a traceable record if content is ever shared without permission. Watermarks are applied at render time and cannot be removed by the viewer.
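Conceptually, the watermark overlay is just the reviewer's identity plus a timestamp. A minimal sketch of building that text (a hypothetical helper, not Lupr's actual code):

```python
from datetime import datetime, timezone

def watermark_text(reviewer_email: str) -> str:
    """Build the overlay text repeated across each page of the pitch.

    Hypothetical helper: the real product applies this at render time,
    so the viewer cannot strip it from the underlying content.
    """
    ts = datetime.now(timezone.utc).strftime("%Y-%m-%d %H:%M UTC")
    return f"{reviewer_email} | {ts}"
```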

Reviewer experience

Here is what each step looks like from the reviewer's perspective. Understanding this flow helps you craft better pitches and set appropriate protection levels.

Preview page

The reviewer lands on a preview page showing the pitch summary, the protection type (pledge, NDA, or none), and an accept button. No detailed content is visible at this stage, giving the reviewer a chance to decide whether to proceed.

Acceptance

If the project has protection enabled, the reviewer sees the full pledge text or a rendered NDA document. They must type their name as a signature and accept the terms before the content is unlocked.

Gated view

After acceptance, the full pitch content is revealed with a watermark overlay. The watermark displays the reviewer's email and current timestamp across the content, ensuring traceability. The reviewer can read all sections including problem, solution, differentiation, and risks.

Feedback form

The structured feedback form includes the following fields:

Star rating: 1-5 stars rating the overall idea

Would you use this?: Yes, Maybe, or No (used to calculate the Validation Score)

Custom questions: Project-specific questions set by the owner

Free text: Open-ended area for detailed written feedback
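The form fields above map naturally onto a structured record. Here is a sketch of what one submission might look like (field names are illustrative, not Lupr's actual schema):

```python
from dataclasses import dataclass, field

@dataclass
class FeedbackSubmission:
    """One reviewer's structured feedback (illustrative model)."""
    reviewer_email: str
    star_rating: int                # 1-5 stars for the overall idea
    would_use: str                  # "Yes", "Maybe", or "No"
    custom_answers: dict = field(default_factory=dict)  # question -> answer
    free_text: str = ""             # open-ended written feedback

    def __post_init__(self):
        # Enforce the form's constraints at construction time.
        if not 1 <= self.star_rating <= 5:
            raise ValueError("star_rating must be between 1 and 5")
        if self.would_use not in {"Yes", "Maybe", "No"}:
            raise ValueError("would_use must be Yes, Maybe, or No")
```

Because every response is tied to the reviewer's identity, `reviewer_email` is required rather than optional in this sketch.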

Managing feedback

As a project owner, you have full control over how feedback is organized, triaged, and communicated back to reviewers.

Feedback list

All feedback for a project is shown in a single list view. You can sort by date, rating, or vote count, and filter by status label or tag. Each entry shows the reviewer name, star rating, response summary, status, tags, and vote count.

Status labels

Assign a status to each piece of feedback to track how it is being handled. Statuses are visible to reviewers in the public feedback widget.

Open: New feedback that has not been reviewed yet

Under consideration: Being evaluated by the team

Planned: Accepted and added to the roadmap

In progress: Actively being worked on

Addressed: The feedback has been resolved or shipped

Not now: Acknowledged but deferred to a later time

Thank you: Appreciation for the feedback, no action needed

Owner responses

You can write a public response to any piece of feedback. Responses are visible in the embedded feedback widget, so reviewers and other visitors can see how you are addressing their input. Use responses to ask clarifying questions, share updates, or thank reviewers.

Tagging

Add free-form tags to any feedback entry to organize responses by theme, feature area, or priority. Tags are searchable and can be used as filters in the feedback list.

Sorting & filtering

Sort feedback by date, rating, or vote count. Filter by status label, tag, or visibility. Combine filters to narrow down to exactly the feedback you need.
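Combining filters and then sorting is conceptually simple. An illustrative sketch over plain dictionaries (field names are assumptions, not Lupr's API):

```python
def filter_and_sort(entries, status=None, tag=None, sort_key="date", reverse=True):
    """Filter feedback entries by status and/or tag, then sort.

    entries: list of dicts with assumed keys "status", "tags",
    "date", "rating", and "votes". Illustrative only.
    """
    result = [
        e for e in entries
        if (status is None or e["status"] == status)
        and (tag is None or tag in e["tags"])
    ]
    return sorted(result, key=lambda e: e[sort_key], reverse=reverse)
```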

Visibility controls

Each feedback entry can be set to public (shown in the embedded widget for all visitors) or private (only visible to the project owner). Default visibility is configurable per project.

CSV export

Export all feedback to CSV for offline analysis, sharing with stakeholders, or importing into other tools. The export includes ratings, responses, status, tags, and timestamps.
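A sketch of what consuming the export might look like with standard tooling (the column names here are assumptions about the CSV layout, not the documented format):

```python
import csv
import io

# Hypothetical sample rows matching the documented export fields.
raw = """rating,would_use,status,tags,submitted_at
5,Yes,Planned,pricing;onboarding,2024-05-01T10:00:00Z
3,Maybe,Open,ui,2024-05-02T11:30:00Z
"""

with io.StringIO(raw) as f:
    rows = list(csv.DictReader(f))

# Offline analysis example: average star rating across the export.
avg_rating = sum(int(r["rating"]) for r in rows) / len(rows)
```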

Voting

The embedded feedback widget supports public upvote and downvote buttons on each feedback entry. Voting lets your audience signal which feedback matters most, surfacing the highest-priority items for your team.

Upvotes

Visitors can upvote feedback they agree with. Higher upvote counts push entries toward the top of the widget when sorted by votes.

Downvotes

Downvotes signal disagreement. The net vote count (upvotes minus downvotes) determines the ranking position in the widget.

Ranking: In the embedded widget, feedback entries are ranked by net vote count (upvotes minus downvotes) by default. Owners can switch to sort by date or rating instead.
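Net-vote ranking amounts to sorting by upvotes minus downvotes. A minimal sketch:

```python
def rank_by_net_votes(entries):
    """Order feedback entries by net vote count, highest first.

    Each entry is a dict with "upvotes" and "downvotes" (assumed fields).
    """
    return sorted(entries, key=lambda e: e["upvotes"] - e["downvotes"], reverse=True)
```

Note that an entry with many votes on both sides can rank below a lightly voted one: 8 up / 8 down nets to zero, while 5 up / 0 down nets to five.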

Validation score

The Validation Score measures how strongly reviewers want your product. It is calculated from the "Would you use this?" question that every reviewer answers as part of the feedback form.

How it works

Each reviewer answers "Would you use this?" with one of three options: Yes, Maybe, or No. The Validation Score is the percentage of respondents who answered "Yes":

Validation Score = (Number of "Yes" responses / Total responses) × 100

Only reviewers who submitted feedback are counted. Pending invites and incomplete responses are excluded from the calculation.
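In code, the calculation above is a one-liner over the submitted answers (an illustrative sketch of the formula, not Lupr's implementation):

```python
def validation_score(responses):
    """Percentage of completed responses that answered "Yes".

    responses: list of "Yes"/"Maybe"/"No" answers from submitted
    feedback only; pending invites and incomplete responses should
    already be excluded before calling this.
    """
    if not responses:
        return 0.0
    yes = sum(1 for r in responses if r == "Yes")
    return yes / len(responses) * 100
```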

40% or higher: Strong user demand. A significant portion of reviewers would use the product.

25% to 39%: Moderate signal. There is interest, but the product may need refinement to strengthen demand.

Below 25%: Weak signal. Consider pivoting, narrowing the audience, or revisiting the core value proposition.

Tip: The 40% threshold comes from Sean Ellis's survey methodology for measuring user demand. If 40% or more of your users say they would be "very disappointed" without the product, you likely have strong demand. Lupr adapts this by using the "Would you use this?" signal as a proxy, displayed as a Validation Score.

What's next?

Now that you understand how feedback works, explore these related features: