Comparing Design Variants
InForm's comparison tools let you evaluate design alternatives side by side, helping you make informed decisions based on multiple criteria.
Overview of Comparison Features
Types of Comparisons
- Side-by-side visualization: View multiple designs simultaneously
- Performance comparison: Compare metrics and objectives
- Parameter comparison: Understand input differences
- Visual differencing: Highlight geometric differences
Comparison Views
- Split-screen 3D: Two or more 3D models side by side
- Tabular comparison: Structured data comparison
- Chart overlays: Performance metrics on shared charts
- Difference visualization: Highlight changes between variants
Setting Up Comparisons
Selecting Variants
From Parameter Space
- Navigate to the Parameter Space view
- Select multiple points using Ctrl+click or box selection
- Right-click and choose "Compare Selected"
- Wait for the comparison view to load
From Saved Variants
- Go to your Saved Variants library
- Check the boxes next to variants you want to compare
- Click the "Compare" button
- Choose your preferred comparison layout
From Project History
- Access your exploration history
- Select configurations from different time points
- Use "Add to Comparison" for each variant
- Launch the comparison view
Comparison Setup
Choosing Layout
- 2-way split: Side-by-side comparison of two variants
- 3-way comparison: Three variants in a grid layout
- 4+ way grid: Multiple variants in a grid arrangement
- Sequential: Step through variants one at a time
Synchronization Options
- Synchronized views: Camera and zoom levels linked
- Independent views: Each variant has its own controls
- Synchronized parameters: Adjust all variants together
- Fixed reference: Keep one variant as a baseline
Visual Comparison Tools
3D Model Comparison
Side-by-Side Viewing
- Synchronized rotation: Move all views together
- Independent control: Examine each design separately
- Zoom synchronization: Maintain consistent scale
- Overlay transparency: See designs superimposed
Difference Visualization
- Geometric differencing: Highlight shape changes
- Color coding: Show areas of difference
- Displacement vectors: Show movement of elements
- Before/after animation: Animated transitions between variants
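If you export variant geometry for analysis outside InForm, the displacement data behind these views is straightforward to compute yourself. The sketch below is a minimal example, assuming two NumPy arrays of vertex positions with identical topology (same vertex count and ordering); the arrays shown are placeholders, not InForm's export format.

```python
import numpy as np

# Hypothetical vertex positions for two variants with identical topology
# (same vertex count and ordering); replace with your exported geometry.
vertices_a = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [1.0, 1.0, 0.0]])
vertices_b = np.array([[0.0, 0.0, 0.1], [1.0, 0.0, 0.0], [1.2, 1.0, 0.0]])

# Displacement vectors from variant A to variant B, one per vertex.
displacement = vertices_b - vertices_a

# Per-vertex magnitude, suitable for color-coding areas of difference.
magnitude = np.linalg.norm(displacement, axis=1)

# Scale to [0, 1] so the values can drive a color map.
span = magnitude.max() - magnitude.min()
color_value = (magnitude - magnitude.min()) / span if span > 0 else np.zeros_like(magnitude)

print("max displacement:", magnitude.max())
```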
Performance Visualization
Metric Comparison Charts
- Bar charts: Compare scalar values across variants
- Radar charts: Multi-dimensional performance comparison
- Line charts: Show performance trends
- Scatter plots: Compare two metrics simultaneously
Relative Performance
- Percentage differences: Show relative improvements
- Normalized scales: Compare different units fairly
- Baseline comparison: Show differences from a reference
- Ranking: Order variants by performance
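If you export raw metric values, the relative-performance calculations above are simple to reproduce. The sketch below uses invented numbers for a baseline and two variants, and assumes lower values are better; adjust the direction for your own metrics.

```python
# Relative-performance calculations on exported metric values.
# The metric values below are invented; lower is assumed to be better.
variants = {"Baseline": 120.0, "Variant A": 102.0, "Variant B": 131.0}
baseline = variants["Baseline"]

# Percentage difference from the baseline (negative = below the baseline).
pct_diff = {name: 100.0 * (value - baseline) / baseline for name, value in variants.items()}

# Min-max normalization puts any metric on a common 0-1 scale.
low, high = min(variants.values()), max(variants.values())
normalized = {name: (value - low) / (high - low) for name, value in variants.items()}

# Rank variants from best to worst (lowest value first).
ranking = sorted(variants, key=variants.get)

print(pct_diff)   # Variant A is 15% below the baseline, Variant B about 9% above
print(ranking)    # ['Variant A', 'Baseline', 'Variant B']
```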
Analysis Techniques
Quantitative Analysis
Performance Metrics
- Identify key metrics for your decision
- Set up comparison tables with all relevant values
- Calculate differences and percentage changes
- Rank variants by each metric
- Look for trade-offs between competing objectives
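A spreadsheet or a short script works well for this kind of comparison table. The sketch below is a minimal pandas example with invented metrics (mass and peak stress) for three variants; adapt the column names, baseline row, and ranking direction to your own exported data.

```python
import pandas as pd

# Hypothetical exported metrics: rows are variants, columns are metrics.
df = pd.DataFrame(
    {"mass_kg": [14.2, 12.8, 13.5], "max_stress_mpa": [180.0, 210.0, 195.0]},
    index=["Baseline", "Variant A", "Variant B"],
)

# Difference and percentage change relative to the baseline row.
diff = df - df.loc["Baseline"]
pct_change = 100.0 * diff / df.loc["Baseline"]

# Rank each metric (1 = lowest value; invert per column for metrics
# where higher is better).
ranks = df.rank(ascending=True)

print(pct_change.round(1))
print(ranks)
```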
Statistical Comparison
- Significance testing: Determine if differences are meaningful
- Confidence intervals: Understand uncertainty in results
- Sensitivity analysis: How robust are the differences?
- Monte Carlo comparison: Account for uncertainty
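If you export the underlying samples (for example, repeated Monte Carlo runs per variant), a bootstrap confidence interval is one common way to judge whether a difference between two variants is meaningful. Everything in the sketch below, including the sample sizes and distributions, is synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for repeated simulation results of two variants.
results_a = rng.normal(loc=100.0, scale=5.0, size=200)
results_b = rng.normal(loc=103.0, scale=5.0, size=200)

# Bootstrap the difference in means to estimate its uncertainty.
n_boot = 5000
diffs = np.empty(n_boot)
for i in range(n_boot):
    sample_a = rng.choice(results_a, size=results_a.size, replace=True)
    sample_b = rng.choice(results_b, size=results_b.size, replace=True)
    diffs[i] = sample_b.mean() - sample_a.mean()

low, high = np.percentile(diffs, [2.5, 97.5])
print(f"mean difference: {results_b.mean() - results_a.mean():.2f}")
print(f"95% CI: [{low:.2f}, {high:.2f}]")
# If the interval excludes zero, the difference is unlikely to be noise alone.
```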
Qualitative Analysis
Visual Assessment
- Aesthetic evaluation: Which design looks better?
- Functional assessment: Does the design meet requirements?
- Usability considerations: How will users interact with it?
- Stakeholder preferences: What do decision-makers prefer?
Design Intent
- Alignment with goals: Which variant best meets objectives?
- Design philosophy: Which approach is more consistent?
- Innovation level: Which variant is more creative or novel?
- Risk assessment: Which variant has fewer unknowns?
Advanced Comparison Features
Multi-Criteria Decision Analysis
Weighted Scoring
- Define criteria and their relative importance
- Assign weights to each criterion (totaling 100%)
- Score each variant on each criterion
- Calculate weighted totals for overall ranking
- Perform sensitivity analysis on weights
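The arithmetic behind weighted scoring is simple enough to check with a short script. The sketch below uses hypothetical criteria, weights (written as fractions summing to 1.0, equivalent to 100%), and scores, and ends with a quick sensitivity check that re-ranks the variants under shifted weights.

```python
# Weighted scoring of variants, with a quick sensitivity check on the weights.
# All criteria, weights, and scores below are illustrative assumptions.
weights = {"performance": 0.5, "cost": 0.3, "aesthetics": 0.2}  # sums to 1.0

scores = {
    "Variant A": {"performance": 8, "cost": 6, "aesthetics": 7},
    "Variant B": {"performance": 7, "cost": 9, "aesthetics": 6},
    "Variant C": {"performance": 9, "cost": 5, "aesthetics": 8},
}

def weighted_total(variant_scores, weights):
    return sum(weights[criterion] * value for criterion, value in variant_scores.items())

totals = {name: weighted_total(s, weights) for name, s in scores.items()}
print(sorted(totals.items(), key=lambda item: item[1], reverse=True))

# Sensitivity check: shift 10 points of weight from performance to cost and re-rank.
alt_weights = {"performance": 0.4, "cost": 0.4, "aesthetics": 0.2}
alt_totals = {name: weighted_total(s, alt_weights) for name, s in scores.items()}
print(sorted(alt_totals.items(), key=lambda item: item[1], reverse=True))
# The top-ranked variant changes under the alternative weights,
# so the weighting itself deserves scrutiny before a final decision.
```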
Pareto Analysis
- Identify non-dominated solutions: variants that no other variant matches or beats on every criterion
- Visualize trade-offs: Understand what you give up and gain
- Preference articulation: Express stakeholder priorities
- Compromise solutions: Find balanced alternatives
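The filter behind a Pareto front is short enough to sketch directly. The example below uses invented two-objective scores and assumes both objectives are expressed so that higher is better.

```python
# Filter non-dominated (Pareto-optimal) variants.
# Objectives are assumed to be "higher is better"; values are invented.
variants = {
    "A": (8.0, 3.0),   # (performance, affordability)
    "B": (6.0, 9.0),
    "C": (7.0, 2.0),   # dominated by A
    "D": (9.0, 1.0),
}

def dominates(p, q):
    """p dominates q if it is at least as good everywhere and strictly better somewhere."""
    return all(a >= b for a, b in zip(p, q)) and any(a > b for a, b in zip(p, q))

pareto_front = [
    name
    for name, objs in variants.items()
    if not any(dominates(other, objs) for other_name, other in variants.items() if other_name != name)
]
print(pareto_front)  # ['A', 'B', 'D']
```

Variants outside the front can usually be set aside; choosing among the remaining ones is a matter of stated preferences.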
Scenario Analysis
Performance Under Different Conditions
- Define scenarios: Different operating conditions or requirements
- Evaluate variants: How does each perform under each scenario?
- Robustness assessment: Which variant is most consistent?
- Risk evaluation: Which variant has the worst downside?
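One way to summarize robustness is to compare each variant's average, worst case, and spread across scenarios. The scenario names and scores in the sketch below are illustrative assumptions.

```python
# Summarize each variant's behaviour across scenarios: average, worst case, spread.
# Scenario names and scores below are invented for illustration.
scores = {
    "Variant A": {"nominal": 0.90, "high_load": 0.55, "cold_climate": 0.78},
    "Variant B": {"nominal": 0.74, "high_load": 0.70, "cold_climate": 0.72},
}

for name, by_scenario in scores.items():
    values = list(by_scenario.values())
    mean = sum(values) / len(values)
    worst = min(values)
    spread = max(values) - worst
    print(f"{name}: mean={mean:.2f}, worst={worst:.2f}, spread={spread:.2f}")

# Variant A scores higher on average, but Variant B is more robust:
# its worst case is better and its spread across scenarios is smaller.
```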
Sensitivity to Assumptions
- Parameter uncertainty: How sensitive are results to input assumptions?
- Model uncertainty: How much do analysis methods matter?
- Future conditions: How might performance change over time?
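A quick way to probe parameter uncertainty is a one-at-a-time sweep: vary one assumption across a plausible range and check whether the preferred variant changes. The cost model, parameter values, and price range in the sketch below are all hypothetical.

```python
# One-at-a-time sensitivity sweep: vary one uncertain assumption and see
# whether the preferred variant changes. The cost model and numbers are hypothetical.
def lifetime_cost(build_cost, annual_energy_kwh, energy_price, years=20):
    return build_cost + annual_energy_kwh * energy_price * years

variants = {
    "Variant A": (500_000, 40_000),  # (build cost, annual energy use in kWh)
    "Variant B": (560_000, 28_000),
}

for energy_price in (0.10, 0.20, 0.30):  # the uncertain assumption, cost per kWh
    costs = {
        name: lifetime_cost(build, energy, energy_price)
        for name, (build, energy) in variants.items()
    }
    best = min(costs, key=costs.get)
    print(f"price {energy_price:.2f}/kWh -> best: {best}")

# Variant A wins at low energy prices and Variant B at the high end, so the
# decision is sensitive to this assumption and worth examining more closely.
```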
Collaboration in Comparison
Team-Based Evaluation
Structured Reviews
- Prepare comparison materials with clear criteria
- Schedule review sessions with relevant stakeholders
- Present variants systematically using comparison tools
- Document feedback and preferences from each reviewer
- Synthesize input into a recommendation
Distributed Review
- Share comparison links: Allow remote review
- Collect feedback forms: Structured input from stakeholders
- Anonymous voting: Reduce bias in evaluation
- Comment and annotation: Allow detailed feedback
Decision Documentation
Comparison Reports
- Executive summary: Key findings and recommendations
- Detailed comparison: Full analysis with all metrics
- Visual documentation: Screenshots and charts
- Rationale: Why the recommended variant was chosen
Decision Tracking
- Decision points: Record when and how decisions were made
- Stakeholder input: Who provided what input?
- Criteria evolution: How did evaluation criteria change?
- Lessons learned: What would you do differently?
Best Practices
Effective Comparison Setup
Choosing Variants
- Meaningful differences: Don't compare nearly identical variants
- Representative range: Include diverse approaches
- Baseline inclusion: Always include a reference design
- Constraint satisfaction: Ensure all variants are feasible
Fair Comparison
- Common ground: Use the same analysis methods for all variants
- Normalized metrics: Ensure fair comparison across different scales
- Complete data: Don't compare variants with missing information
- Consistent assumptions: Use the same boundary conditions
Analysis Best Practices
Systematic Evaluation
- Define criteria first: Know what you're evaluating before you start
- Use multiple perspectives: Technical, economic, aesthetic, etc.
- Consider all stakeholders: Who will be affected by this decision?
- Document assumptions: Record what you assumed and why
Avoiding Bias
- Blind evaluation: Hide variant names during initial assessment
- Multiple evaluators: Get different perspectives
- Structured methods: Use systematic approaches, not just intuition
- Devil's advocate: Actively look for problems with preferred options
Common Comparison Scenarios
Design Optimization
- Performance optimization: Which variant performs best?
- Cost optimization: Which variant provides best value?
- Risk minimization: Which variant has the least downside?
- Robustness: Which variant works well across conditions?
Design Evolution
- Incremental improvements: How much better is the new version?
- Alternative approaches: Fundamentally different design strategies
- Technology comparison: Different technical solutions
- Scale comparison: How does performance scale with size?
Stakeholder Evaluation
- Client preferences: Which variant do clients prefer?
- User experience: Which variant is easier to use?
- Maintenance considerations: Which variant is easier to maintain?
- Future adaptability: Which variant can evolve better?
Troubleshooting
Visualization Issues
- Too many variants: Limit to 4-6 for effective comparison
- Cluttered displays: Use tabbed or sequential views
- Scale differences: Normalize or use relative scales
- Performance lag: Reduce model complexity for real-time comparison
Analysis Difficulties
- No clear winner: Accept that trade-offs may be necessary
- Subjective criteria: Use structured evaluation methods
- Stakeholder disagreement: Facilitate discussion and compromise
- Analysis paralysis: Set decision deadlines and criteria
Next Steps
- Visualization Tools: Advanced techniques for variant visualization
- Collaboration: Working with teams on variant selection
- Troubleshooting: Common issues and solutions