More likely, DCS World modules will be ported over to become generic game-engine assets before DCS World itself is rewritten.
DLSS is a smart thing, but I still consider it the MP3 of the image world. You get the lower bandwidth, and you get the artefacts. If you can live with the latter, then enjoy the former. In the long term, though, I guess we'd all rather be without the artefacts.
I wonder, though, what happens if you run the game at the raw resolution DLSS is supposed to render at and simply upsample it. How would that affect performance and quality? I have seen many benchmarks so far, but never one doing that; only DLSS on/off.
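For the quality half of that question, you can at least get a crude preview from a still image. A minimal Python/Pillow sketch, assuming a native-resolution screenshot on disk and DLSS Quality's roughly 67%-per-axis internal resolution (the filename and the scale factor are both my assumptions, not measurements):

```python
# Crude approximation of "render at DLSS's internal res, then simply
# upsample": downscale a native screenshot, then stretch it back up.
# This only previews quality; it says nothing about performance.
from PIL import Image

INTERNAL_SCALE = 0.67  # assumed DLSS "Quality" internal resolution per axis

img = Image.open("dcs_native.png")  # hypothetical native-res screenshot
w, h = img.size

# Stand-in for rendering at the lower internal resolution.
low = img.resize((int(w * INTERNAL_SCALE), int(h * INTERNAL_SCALE)),
                 Image.LANCZOS)

# The "simply upsample it" case: a plain bilinear stretch back to native.
low.resize((w, h), Image.BILINEAR).save("dcs_naive_upsample.png")
```

Comparing that output against a DLSS capture of the same scene would at least show what the upscaler adds over a dumb stretch.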
I could take a few artifacts if it doubled my framerate.
But then, I grew up in a time when you played games with yellow, red, and white cables hooked up to a CRT TV, and you loved it.
Remember VHS tapes… how you had that degradation?
It's amazing what we put up with just to watch a movie.
So no, I don't think it would look like this with DLSS.
DCS World VHS edition (courtesy of PHOTOMOSH)
Ha ha, I do! The bootlegged copies with one-man dubbing and almost non-existent colors were quite popular here in the 80s.
Just to point out, DCS's issue is the outdated software. Now, perhaps NVIDIA can make a driver hack, much like their FFR, for DLSS to work (sort of) on these outdated engines. However, iirc, DCS is not friendly with Nvidia's driver-level FFR, so it would likely be just as hit and miss.
Awesome. Even ‘DCS World VHS Edition’ looks like a real simulator.
You point that out over and over again. Who cares. It is what we have now, and what we are buying hardware for, now.
NVIDIA holding back is not helpful. I would pay a premium to get rid of Smart Smoothing in DCS World, but I can’t.
Unfortunately, it will be a while before Linux gets the recognition for what it is. MS did, from what I recall, purchase SUSE, which at the time caused some concerns for Linux and could possibly, in part, have caused MacOS issues. However, these concerns have mostly faded.
A good thing to watch is the fallout between Bill and Jobs. Also, it would have been interesting if Bill had stayed on as part of the OS/2 team instead of parting ways and releasing W95 (aka MS-DOS 7 autoloading Windows v4.x).
Remember the question:
“Do you plan on continuing to develop Windows?”
Bill: “No, then you would just have OS/2.”
Windows has a strong hold on the desktop market. Linux's diversity also works partly against it gaining ground. There is also the fact that OEMs include, the majority of the time, what consumers mistakenly see as a free copy of the Windows OS, when in reality it is included as part of the PC's cost. Dell, for example, dropped iirc $100 off if you opted for pre-installed Ubuntu.
The other issue is recognition, when the Linux identity is hidden in the About screen or licensing text, which a lot of people don't take the time to read through.
That fortune you're talking about could be spent on an updated engine that removes the need for Smart Smoothing and makes it work with FFR, DLSS, and a slew of now-available hardware features.
Do you think Unreal and Unity, along with other modern, forward-thinking game devs, blame new hardware? No, they evolve their code base with new, improved versions that support and utilize new hardware features.
Linux's diversity also works partly against it gaining ground.
Not as much of a problem anymore. AppImages and similar are becoming more common.
I think you do not realize how complex DCS World aircraft modules are. From what I have seen, the LUA scripting used to replicate the relevant avionics and mechanical systems is not particularly elegant itself.
We are probably looking at a full redevelopment of DCS World itself and all aircraft modules as being easier than simply ‘porting’ this code to a new engine.
Then it is an issue until they can update their code to support modern features, something that should have been started long ago. Scripts for how a plane operates shouldn't affect how something is rendered (a toy sketch of that separation is below).
You can’t blame hardware for new features not being supported on old software.
Do you know how many kinds of games are made with Unreal and Unity?
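To make that separation concrete, here is a purely hypothetical Python toy (nothing here is DCS's actual Lua API or architecture): the simulation step owns and mutates the state, the render step only reads it, so the renderer could be swapped out without touching flight logic.

```python
# Toy sketch of sim/render decoupling; no real DCS names or APIs here.
from dataclasses import dataclass

@dataclass
class EngineState:
    throttle: float = 0.0  # 0..1 pilot input
    rpm: float = 0.0       # current engine speed

def simulate(state: EngineState, dt: float) -> None:
    """Simulation side: moves rpm toward the throttle setting."""
    target = state.throttle * 10_000.0  # made-up max rpm
    state.rpm += (target - state.rpm) * min(1.0, 2.0 * dt)

def render(state: EngineState) -> str:
    """Render side: only reads state, never mutates it."""
    return f"RPM needle at {state.rpm:,.0f}"

state = EngineState(throttle=0.8)
for _ in range(3):
    simulate(state, dt=1 / 60)   # fixed-rate sim tick
    print(render(state))         # replacing this renderer changes nothing above
```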
AppImages are nothing all that new; I had a variation on Windows as well.
There is also another form that runs images in its own sandbox.
And yes, I have had difficulty getting some AppImages to run on some distros, even using the command line to attempt to get them working. I had to execute the AppImage from the command line every time, which is not user-friendly for the masses. Those particular images had to be installed directly on those distros, which was disappointing, as they should have just worked.
That being said, AppImages are a positive direction: not good at saving space, but good at potentially keeping programs compatible.
That is not what I was suggesting.
To have a clue as to how complex porting this kind of software is, you need to really know how much something like Marlin 3D printer firmware has incrementally improved over so many years. Tons of clever hacks, several major rewrites, and we are still having problems with the firmware not doing things smoothly, even on the best hardware.
Last time I checked, the DCS World LUA scripts I could see for such things as engine simulation, mechanical failure simulation, and avionics/display simulation were much less elegant than what I see in 3D printer firmware.
AppImages
Works for me.
As mentioned, it is not consistent on all distros. In my case it was between two Manjaro versions: one works as expected and one is a pain, which in itself is strange.
With DCS you'll just have to accept its limitations. As for mGPU, maybe DCS can add non-SLI-based multi-GPU support. Otherwise, grab four 2080 Tis and scale up where it last had support (SLI), as the new GPU line's features are not supported anyway.
Exactly the same could be said of NVIDIA hardware, for technologically much more trivial reasons. The obvious explanation is market politics.
That’s my point, and I think the entire point of this thread as well.
I am skeptical about it for full frames, but really curious about a foveated rendering version that does the fovea part at full resolution every frame (EDIT: fully rendered at that full resolution, that is; no DLSS there), and retains and incorporates its history of those full-resolution patches in the DLSS for the extra-foveal area.
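As a purely illustrative sketch of the compositing part of that idea (still images standing in for real render targets, and a plain Lanczos upscale standing in for the DLSS-reconstructed periphery; the fovea radius and function names are my own inventions):

```python
# Hypothetical illustration: paste a natively-rendered fovea patch over a
# reconstructed periphery. Pillow stills stand in for a real render loop.
from PIL import Image, ImageDraw

FOVEA_RADIUS = 300  # px at output resolution; an arbitrary assumption

def composite_foveated(full_res, low_res, gaze_xy):
    """Full-res pixels inside the fovea circle, upscaled pixels outside.

    In the idea above, the periphery would come from DLSS fed with the
    history of these full-res patches; Lanczos is only a stand-in here.
    """
    w, h = full_res.size
    periphery = low_res.resize((w, h), Image.LANCZOS)

    # Circular mask marking the region rendered at full res every frame.
    mask = Image.new("L", (w, h), 0)
    gx, gy = gaze_xy
    ImageDraw.Draw(mask).ellipse(
        (gx - FOVEA_RADIUS, gy - FOVEA_RADIUS,
         gx + FOVEA_RADIUS, gy + FOVEA_RADIUS),
        fill=255)

    return Image.composite(full_res, periphery, mask)
```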
The problem, as we know, is that SLI and CF never took off as anticipated by either team. So unfortunately SLI has more of a niche group of users, even with Nvidia releasing their funhouse demo to try and push this feature.
If, as you said, Nvidia were not holding this feature back with things like restricting SLI to top-level cards only, it might have more users utilizing it.
Unfortunately, as we both know, SLI users are a small niche group due to the limited programs that support it and the issues of the past with the multi-GPU techniques used. Maybe if StarVR One had had a real launch, and Pimax had also supported alternate-GPU rendering per eye, we might have seen Nvidia keep this feature in consumer-targeted cards.
DCS unfortunately hadn't made plans to support upcoming features. So, like your example of the 3D printer firmware, patching/hacking can only push DCS so far. Brute force to get performance this way is crazy, like having to buy server-level processors to get the single-core performance you're looking for, when both Intel and AMD could release, say, an i5/R5 that has the needed single-core performance for DCS.
The thread, if going by the OP, was about Nvidia banning reviewers for focusing on rasterization performance.
You seem to be suggesting that lack of demand for SLI specifically is an OK reason not to have any doubling of performance this generation, whether by SLI or multiple chips on a single card with a silicon interposer…
Indeed.
Because the evidence suggests NVIDIA held back performance, deliberately, and tried to pass it off. Basically hoping we were all stupid enough to ignore the reality of essential performance in favor of optional features.
And this is really just further evidence after NVIDIA claimed ‘double’ shader core count.
Who's to say they won't release another x2 card? They just haven't done so at present, to increase sales of their current SKUs.
Not that uncommon. Imagine if we were advancing with RISC processors vs CISC.

And this is really just further evidence after NVIDIA claimed ‘double’ shader core count.
It might be a gimmick method (we know gimmicks are used to push sales, i.e. “best theoretical performance if all components are running at top spec”). Now, if you want the conspiracy potential: how about the driver update where 1080 Ti users noticed a performance drop vs the previous driver, suggesting they did so to push sales of 2000-series cards?
Simply put, the issue is shared a bit by both: Nvidia limiting possible gains by spending silicon on other features, and software devs that have failed to adapt to get the gains new hardware architecture offers.
With some programs and benches showing improvements vs those which don't (usually due to limitations in utilization).

They just haven't done so at present, to increase sales of their current SKUs.
Exactly. That is likely to be disproportionately disadvantageous to us, and for lack of transparency, we don’t know the extent to which this was unavoidable or greedy.

Now, if you want the conspiracy potential.
No, I don’t. I think it was just despicable and most probably irrational on NVIDIA’s part. I am not suggesting or alleging anything worse than that.

Exactly. That is likely to be disproportionately disadvantageous to us, and for lack of transparency, we don’t know the extent to which this was unavoidable or greedy.
At one point you waited for their, so to speak, flagship cards. Now you kind of don't. Except with the 2000-series cards: not long after release, new “Super” versions were released, except for the 2080 Ti, if I'm not mistaken.
Many Pimax users felt the same with the 8K+ release coming less than a year later, and how quickly the OG 8K faded as a SKU before that.
Guess everyone could be more open in marketing. However, business is more about making money and relying on early adopters.