Comparing Design Variants

InForm's comparison tools let you evaluate design alternatives side by side, helping you make informed decisions based on multiple criteria.

Overview of Comparison Features

Types of Comparisons

  • Side-by-side visualization: View multiple designs simultaneously
  • Performance comparison: Compare metrics and objectives
  • Parameter comparison: Understand input differences
  • Visual differencing: Highlight geometric differences

Comparison Views

  • Split-screen 3D: Two or more 3D models side by side
  • Tabular comparison: Structured data comparison
  • Chart overlays: Performance metrics on shared charts
  • Difference visualization: Highlight changes between variants

Setting Up Comparisons

Selecting Variants

From Parameter Space

  1. Navigate to the Parameter Space view
  2. Select multiple points using Ctrl+click or box selection
  3. Right-click and choose "Compare Selected"
  4. Wait for the comparison view to load

From Saved Variants

  1. Go to your Saved Variants library
  2. Check the boxes next to variants you want to compare
  3. Click the "Compare" button
  4. Choose your preferred comparison layout

From Project History

  1. Access your exploration history
  2. Select configurations from different time points
  3. Use "Add to Comparison" for each variant
  4. Launch the comparison view

Comparison Setup

Choosing a Layout

  • 2-way split: Side-by-side comparison of two variants
  • 3-way comparison: Three variants in a grid layout
  • 4+ way grid: Multiple variants in a grid arrangement
  • Sequential: Step through variants one at a time

Synchronization Options

  • Synchronized views: Camera and zoom levels linked
  • Independent views: Each variant has its own controls
  • Synchronized parameters: Adjust all variants together
  • Fixed reference: Keep one variant as a baseline

Visual Comparison Tools

3D Model Comparison

Side-by-Side Viewing

  • Synchronized rotation: Move all views together
  • Independent control: Examine each design separately
  • Zoom synchronization: Maintain consistent scale
  • Overlay transparency: See designs superimposed

Difference Visualization

  • Geometric differencing: Highlight shape changes
  • Color coding: Show areas of difference
  • Displacement vectors: Show movement of elements
  • Before/after animation: Animated transitions between variants
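
The displacement idea is easy to reproduce outside the viewer when two variants share the same mesh topology: subtract corresponding vertex positions and color by magnitude. A minimal numpy sketch, with invented array shapes and values for illustration:

```python
import numpy as np

# Hypothetical vertex arrays (N x 3) for two variants with identical topology.
vertices_a = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
vertices_b = np.array([[0.0, 0.0, 0.1], [1.1, 0.0, 0.0], [0.0, 0.9, 0.0]])

displacements = vertices_b - vertices_a              # per-vertex movement vectors
magnitudes = np.linalg.norm(displacements, axis=1)   # scalar field for color coding
print("max displacement:", magnitudes.max())
print("vertices moved > 0.05:", np.flatnonzero(magnitudes > 0.05))
```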

Performance Visualization

Metric Comparison Charts

  • Bar charts: Compare scalar values across variants
  • Radar charts: Multi-dimensional performance comparison
  • Line charts: Show performance trends
  • Scatter plots: Compare two metrics simultaneously

Relative Performance

  • Percentage differences: Show relative improvements
  • Normalized scales: Compare different units fairly
  • Baseline comparison: Show differences from a reference
  • Ranking: Order variants by performance
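
In practice, "normalized scales" usually means rescaling each metric to a common range before comparing. Here is a minimal sketch using min-max normalization, with invented values in incompatible units:

```python
# Hypothetical raw scores in different units: mass (kg) and deflection (mm).
raw = {"A": {"mass": 12.4, "deflection": 3.1},
       "B": {"mass": 11.1, "deflection": 4.0},
       "C": {"mass": 13.0, "deflection": 2.6}}

def min_max_normalize(scores):
    """Rescale each metric to [0, 1] across all variants
    (0 = best here, assuming lower raw values are better)."""
    metrics = next(iter(scores.values())).keys()
    out = {v: {} for v in scores}
    for m in metrics:
        vals = [scores[v][m] for v in scores]
        lo, hi = min(vals), max(vals)
        for v in scores:
            out[v][m] = (scores[v][m] - lo) / (hi - lo)
    return out

print(min_max_normalize(raw))
```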

Analysis Techniques

Quantitative Analysis

Performance Metrics

  1. Identify key metrics for your decision
  2. Set up comparison tables with all relevant values
  3. Calculate differences and percentage changes
  4. Rank variants by each metric
  5. Look for trade-offs between competing objectives
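
If you want to reproduce steps 3 and 4 outside the comparison tables, a minimal Python sketch (metric names and values are invented for illustration) might look like this:

```python
# Hypothetical metric values for three variants.
variants = {
    "A": {"mass_kg": 12.4, "stiffness": 310.0, "cost_usd": 980.0},
    "B": {"mass_kg": 11.1, "stiffness": 285.0, "cost_usd": 1040.0},
    "C": {"mass_kg": 13.0, "stiffness": 355.0, "cost_usd": 1100.0},
}
baseline = "A"  # reference design

for metric in variants[baseline]:
    ref = variants[baseline][metric]
    # Rank ascending; flip the sort for "higher is better" metrics.
    ranked = sorted(variants, key=lambda v: variants[v][metric])
    print(f"\n{metric} (baseline {baseline} = {ref}):")
    for name, vals in variants.items():
        diff = vals[metric] - ref
        pct = 100.0 * diff / ref
        print(f"  {name}: {vals[metric]:8.1f}  diff={diff:+8.1f} ({pct:+6.1f}%)  rank={ranked.index(name) + 1}")
```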

Statistical Comparison

  • Significance testing: Determine if differences are meaningful
  • Confidence intervals: Understand uncertainty in results
  • Sensitivity analysis: How robust are the differences?
  • Monte Carlo comparison: Account for uncertainty
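
As one concrete form of significance testing, the sketch below bootstraps a confidence interval for the difference in mean performance between two variants, assuming you have repeated measurements or Monte Carlo samples for each (the values shown are invented):

```python
import random

# Hypothetical repeated measurements (e.g., Monte Carlo runs) per variant.
variant_a = [101.2, 99.8, 100.5, 102.1, 98.9, 100.9, 101.5, 99.5]
variant_b = [103.0, 104.2, 102.5, 103.8, 102.9, 104.6, 103.3, 102.2]

def bootstrap_diff_ci(a, b, n_boot=10_000, alpha=0.05, seed=0):
    """Bootstrap confidence interval for mean(b) - mean(a)."""
    rng = random.Random(seed)
    diffs = []
    for _ in range(n_boot):
        ra = [rng.choice(a) for _ in a]  # resample with replacement
        rb = [rng.choice(b) for _ in b]
        diffs.append(sum(rb) / len(rb) - sum(ra) / len(ra))
    diffs.sort()
    lo = diffs[int(alpha / 2 * n_boot)]
    hi = diffs[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

lo, hi = bootstrap_diff_ci(variant_a, variant_b)
print(f"95% CI for mean difference: [{lo:.2f}, {hi:.2f}]")
# If the interval excludes zero, the difference is unlikely to be noise.
```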

Qualitative Analysis

Visual Assessment

  1. Aesthetic evaluation: Which design looks better?
  2. Functional assessment: Does the design meet requirements?
  3. Usability considerations: How will users interact with it?
  4. Stakeholder preferences: What do decision-makers prefer?

Design Intent

  • Alignment with goals: Which variant best meets objectives?
  • Design philosophy: Which approach is more consistent?
  • Innovation level: Which variant is more creative or novel?
  • Risk assessment: Which variant has fewer unknowns?

Advanced Comparison Features

Multi-Criteria Decision Analysis

Weighted Scoring

  1. Define criteria and their relative importance
  2. Assign weights to each criterion (totaling 100%)
  3. Score each variant on each criterion
  4. Calculate weighted totals for overall ranking
  5. Perform sensitivity analysis on weights
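
The calculation in steps 2-5 is simple enough to sanity-check in a few lines of Python. The sketch below uses invented criteria, weights, and scores, and includes a crude weight-sensitivity check for step 5:

```python
# Hypothetical criteria weights (summing to 1.0) and per-variant scores (0-10).
weights = {"performance": 0.5, "cost": 0.3, "aesthetics": 0.2}
scores = {
    "A": {"performance": 8, "cost": 6, "aesthetics": 7},
    "B": {"performance": 6, "cost": 9, "aesthetics": 8},
    "C": {"performance": 9, "cost": 5, "aesthetics": 6},
}

def weighted_total(variant_scores, w):
    return sum(w[c] * variant_scores[c] for c in w)

ranking = sorted(scores, key=lambda v: weighted_total(scores[v], weights), reverse=True)
print("Ranking:", ranking)

# Step 5: perturb each weight and see whether the winner changes.
for crit in weights:
    bumped = dict(weights)
    bumped[crit] += 0.1
    total = sum(bumped.values())
    bumped = {c: w / total for c, w in bumped.items()}  # renormalize to 1.0
    new_best = max(scores, key=lambda v: weighted_total(scores[v], bumped))
    print(f"+0.1 on {crit}: best = {new_best}")
```

With these numbers the overall winner flips when the performance weight is increased, which is exactly the kind of fragility the sensitivity step is meant to surface.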

Pareto Analysis

  • Identify non-dominated solutions: Designs that no other variant beats on every criterion
  • Visualize trade-offs: Understand what you give up and gain
  • Preference articulation: Express stakeholder priorities
  • Compromise solutions: Find balanced alternatives
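
Non-dominated filtering is straightforward to express directly. The sketch below assumes two objectives that are both minimized, with invented values:

```python
# Hypothetical (cost, mass) objectives per variant; lower is better for both.
points = {"A": (980, 12.4), "B": (1040, 11.1), "C": (1100, 13.0), "D": (990, 11.8)}

def dominates(p, q):
    """p dominates q if p is no worse in every objective and better in at least one."""
    return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

pareto = [name for name, p in points.items()
          if not any(dominates(q, p) for other, q in points.items() if other != name)]
print("Pareto front:", pareto)  # the variants no other design beats outright
```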

Scenario Analysis

Performance Under Different Conditions

  1. Define scenarios: Different operating conditions or requirements
  2. Evaluate variants: How does each perform under each scenario?
  3. Robustness assessment: Which variant is most consistent?
  4. Risk evaluation: Which variant has the worst downside?
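
Steps 3 and 4 reduce to comparing aggregate statistics across scenarios. A minimal sketch with invented per-scenario scores:

```python
# Hypothetical performance of each variant under three operating scenarios.
performance = {
    "A": {"nominal": 92, "high_load": 74, "cold_start": 81},
    "B": {"nominal": 88, "high_load": 85, "cold_start": 84},
    "C": {"nominal": 95, "high_load": 60, "cold_start": 90},
}

for name, by_scenario in performance.items():
    vals = list(by_scenario.values())
    mean = sum(vals) / len(vals)
    spread = max(vals) - min(vals)   # consistency: lower spread = more robust
    worst = min(vals)                # downside risk
    print(f"{name}: mean={mean:.1f}  spread={spread}  worst-case={worst}")
# Here B has the best worst case even though C wins under nominal conditions.
```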

Sensitivity to Assumptions

  • Parameter uncertainty: How sensitive are results to input assumptions?
  • Model uncertainty: How much do analysis methods matter?
  • Future conditions: How might performance change over time?

Collaboration in Comparison

Team-Based Evaluation

Structured Reviews

  1. Prepare comparison materials with clear criteria
  2. Schedule review sessions with relevant stakeholders
  3. Present variants systematically using comparison tools
  4. Document feedback and preferences from each reviewer
  5. Synthesize input into a recommendation

Distributed Review

  • Share comparison links: Allow remote review
  • Collect feedback forms: Structured input from stakeholders
  • Anonymous voting: Reduce bias in evaluation
  • Comment and annotation: Allow detailed feedback

Decision Documentation

Comparison Reports

  • Executive summary: Key findings and recommendations
  • Detailed comparison: Full analysis with all metrics
  • Visual documentation: Screenshots and charts
  • Rationale: Why the recommended variant was chosen

Decision Tracking

  • Decision points: Record when and how decisions were made
  • Stakeholder input: Who provided what input?
  • Criteria evolution: How did evaluation criteria change?
  • Lessons learned: What would you do differently?

Best Practices

Effective Comparison Setup

Choosing Variants

  • Meaningful differences: Don't compare nearly identical variants
  • Representative range: Include diverse approaches
  • Baseline inclusion: Always include a reference design
  • Constraint satisfaction: Ensure all variants are feasible

Fair Comparison

  • Common ground: Use the same analysis methods for all variants
  • Normalized metrics: Ensure fair comparison across different scales
  • Complete data: Don't compare variants with missing information
  • Consistent assumptions: Use the same boundary conditions for every variant

Analysis Best Practices

Systematic Evaluation

  1. Define criteria first: Know what you're evaluating before you start
  2. Use multiple perspectives: Technical, economic, aesthetic, etc.
  3. Consider all stakeholders: Who will be affected by this decision?
  4. Document assumptions: Record what you assumed and why

Avoiding Bias

  • Blind evaluation: Hide variant names during initial assessment
  • Multiple evaluators: Get different perspectives
  • Structured methods: Use systematic approaches, not just intuition
  • Devil's advocate: Actively look for problems with preferred options

Common Comparison Scenarios

Design Optimization

  • Performance optimization: Which variant performs best?
  • Cost optimization: Which variant provides the best value?
  • Risk minimization: Which variant has the least downside?
  • Robustness: Which variant works well across conditions?

Design Evolution

  • Incremental improvements: How much better is the new version?
  • Alternative approaches: Fundamentally different design strategies
  • Technology comparison: Different technical solutions
  • Scale comparison: How does performance scale with size?

Stakeholder Evaluation

  • Client preferences: Which variant do clients prefer?
  • User experience: Which variant is easier to use?
  • Maintenance considerations: Which variant is easier to maintain?
  • Future adaptability: Which variant can evolve better?

Troubleshooting

Visualization Issues

  • Too many variants: Limit to 4-6 for effective comparison
  • Cluttered displays: Use tabbed or sequential views
  • Scale differences: Normalize or use relative scales
  • Performance lag: Reduce model complexity for real-time comparison

Analysis Difficulties

  • No clear winner: Accept that trade-offs may be necessary
  • Subjective criteria: Use structured evaluation methods
  • Stakeholder disagreement: Facilitate discussion and compromise
  • Analysis paralysis: Set decision deadlines and criteria

Next Steps