I think we need to make a testing checklist right now.

I've noticed there are a lot of issues that vary from case to case: some testers find them, some don't.
Some testers say the result is better, some say it is the same.

So I think we have to make a testing checklist and compare results:

  1. Game

  2. PiTool version

  3. SteamVR version (normal, beta)

  4. PiTool settings
    4.1 render quality

  5. SteamVR settings
    5.1 supersampling
    5.2 reprojection

  6. Headset

  7. Hardware (CPU, GPU, other)

  8. Results & issues
    8.1 fps
    8.2 frame drops
    8.3 quality
    8.4 opinion
    8.5 issues
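To make sure every tester reports the same fields, the checklist above could be captured as a shared record template. Here is a minimal sketch; all field and function names are my own suggestions, not an agreed standard:

```python
import copy

# Minimal sketch of a shared test-record template covering the checklist
# fields above. Field names are suggestions, not an agreed standard.
TEST_RECORD_TEMPLATE = {
    "game": "",                  # 1. Game
    "pitool_version": "",        # 2. PiTool version
    "steamvr_version": "",       # 3. SteamVR version (normal or beta)
    "pitool_settings": {
        "render_quality": None,  # 4.1 render quality
    },
    "steamvr_settings": {
        "supersampling": None,   # 5.1 supersampling
        "reprojection": None,    # 5.2 reprojection
    },
    "headset": "",               # 6. Headset model
    "hardware": {                # 7. CPU, GPU, other
        "cpu": "",
        "gpu": "",
        "other": "",
    },
    "results": {                 # 8. Results & issues
        "fps": None,
        "frame_drops": None,
        "quality": "",
        "opinion": "",
        "issues": [],
    },
}

def new_test_record(**overrides):
    """Return a fresh copy of the template with top-level fields overridden."""
    record = copy.deepcopy(TEST_RECORD_TEMPLATE)
    record.update(overrides)
    return record
```

Each tester fills in one record per game/configuration, which makes the results directly comparable (and easy to paste into a Trello card or spreadsheet).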

If we don't set these testing standards, we will never know what exactly the problem is.
Maybe you don't have to record video for every test, but we need to be able to compare results between testers in the same environment.

We don't need too many games to test, but you should specify the same games beforehand so the results can be compared.

Maybe use Trello with the board set to public, one card per game, and comment the results there. Then everyone can read the testing results.

@deletedpimaxrep1 @VoodooDE @mixedrealityTV @SweViver


You can only compare tests/reviews done by the same tester. For a better comparison and understanding of what they did, they would also have to document the testing tools/method, i.e. what they used instead of this.


I am afraid we don't have enough time right now; people have to make a decision, so I don't want to make it complicated.

Maybe we list only 5-10 games and see whether the results point in the same direction.

Sebastian's last video talks about distortion, and about why we suspect it is distortion rather than something else such as reprojection. So maybe we compare across other testers and try to figure out why only some people can find it, and what that issue should properly be called in technical terms.

I think that after each person gets their headset, they will start experimenting with game settings for better quality, so we should have a settings-and-results list too.