I don’t understand why developers and publishers aren’t prioritizing spectacle games with simple graphics like TABS, Mount & Blade, or similar. Use modern processing power to just throw tons of shit on screen, make it totally chaotic and confusing. Huge battles are super entertaining.
There’s no better generational leap than Monster Hunter Wilds, which looks like a PS2 game on its lowest settings and still chugs at 24fps on my PC.
I don’t mind the graphics that much, what really pisses me off is the lack of optimization and heavy reliance on frame gen.
To be fair, it isn’t just about graphics.
Something like Zelda: Twilight Princess HD to Zelda: Breath of the Wild was a huge leap in just gameplay. (And also in graphics, but that’s not my point.)
I feel like we won’t be able to see the difference until a couple of years from now, like CGI in old movies.
I would argue that late-SNES-era games look far better than their early-3D-era follow-ups.
They said we’d never have consumer tech that could white clip in real time but look at us now.
Games did teach me about diminishing returns though
This is what a remaster used to look like.
It was a remake not a remaster. The hit boxes weren’t the same.
The difference is academic and doesn’t affect my point.
Pretty sick if you ask me
I agree whole heartedly
I mean, how much more photorealistic can you get? Regardless, the same game would look very different in 4K (real, not what consoles do) vs 1080p.
The lighting in that image is far, far from photorealistic. Light transport is hard.
That’s true, but realistic lighting still wouldn’t make anywhere near the same amount of difference that the other example shows.
Let’s compare two completely separate games to a game and a remaster.
Generational leaps then:
Good lord.
EDIT: That isn’t even the Zero Dawn remaster. That is literally two still-image screenshots of Forbidden West on both platforms.
Good. Lord.
What game is the first one
Yeah no. You went from console to portable.
We’ve had absolutely huge leaps in graphical ability. Denying that we’re getting diminishing returns now is just ridiculous.
We’re still getting huge leaps. It simply doesn’t translate into massively improved graphics. What those leaps do result in, however, is major performance gains.
I have played Horizon Zero Dawn, its remaster, and Forbidden West. I am reminded how much better Forbidden West looks and runs on PS5 compared to either version of Zero Dawn. The differences are absolutely there, it’s just not as spectacular as the jump from 2D to 3D.
The post comes off like a criticism of hardware not getting better fast enough. Wait until we can create dirt, sand, water or snow simulations in real time, instead of having to fake the look of physics. Imagine real simulations of wind and heat.
And then there’s Gaussian splatting, which absolutely is a huge leap. Forget trees practically being arrangements of PNGs; what if each and every leaf and branch had volume? What if leaves actually fell off?
Then there’s efficiency. What if you could run Monster Hunter Wilds at max graphics, on battery, for hours? The first gen M1 Max MacBook Pro can comfortably run Baldur’s Gate III. Reducing power draw would have immense benefits on top of graphical improvements.
Combined with better and better storage and VR/AR, there is still plenty of room for tech to grow. Saying “diminishing returns” is like saying that fire burns you when you touch it.
I am reminded how much better Forbidden West looks and runs on PS5 compared to either version of Zero Dawn.
Really? I’ve played both on PS5 and didn’t notice any real difference in performance or graphics. I did notice that the PC Version of Forbidden West has vastly higher minimum requirements though. Which is the opposite of performance gains.
Who the fuck cares if leaves are actually falling off or spawning in above your screen to fall?
And BG3 has notoriously low minimums, it is the exception, not the standard.
If you want to see every dimple on the ass of a horse then that’s fine, build your expensive computer and leave the rest of us alone. Modern Next Gen Graphics aren’t adding anything to a game.
The fact that the Game Boy Advance looks that much better than the Super Nintendo despite being a handheld, battery powered device is insane
Is it that much better? The colours just look more saturated to me
Because most GBA games were meant to be desaturated due to the terrible screen
There’s noticeably more detail, especially along the coastline. Also, the more saturated colors improve contrast.
The GBA just has reworked art. The SNES could easily do the same thing.
It is baffling to me that people hate cross gen games so much. Like, how awful for PS4 owners that don’t have to buy a new console to enjoy the game, and how awful for PS5 owners that the game runs at the same fidelity at over 60FPS, or significantly higher fidelity at the same frame rate.
They should have made the PS4 version the only one. Better yet, we should never make consoles again because they can’t make you comprehend four dimensions to be new enough.
The point isn’t about cross generation games. It’s about graphics not actually getting better anymore unless you turn your computer into a space heater rated for Antarctica.
It’s a pointless point. Complain about power draw. Push ARM.
ARM isn’t going to magically make GPUs need less brute force energy in badly optimized games.
The question is whether “realism” was ever a good target. The best games are not the most realistic ones.
We should be looking at more particles, more dynamic lighting, more effects. Realism is for sure a goal, just not in the way you think; Pixar movies have realistic lighting and shadows but aren’t “realistic”.
After I started messing with Cycles in Blender I went back to wanting more “realistic” graphics; it’s better for stylized games too.
But yeah, I want the focus to shift towards procedural generation (I like how Houdini and Unreal approach it right now), more physics-based interactions, elemental interactions, real-time fire, smoke, fluid, etc. Destruction is the biggest disappointment; I was really hoping for an FPS that would let me spend hours bulldozing and blowing up the map.
So many retro games are replayable and fun to this day, but I struggle to return to games whose art style relied on being “cutting edge realistic” 20 years ago.
I dunno, Crysis looks pretty great on modern hardware and it’s 18 years old.
Also, CRYSIS IS 18 WHERE DID THE TIME GO?
Yeah, but it was about 15 years ahead of its time.
There’s a joke in there somewhere about Crysis being the age of consent but I just can’t land it right now.
Probably because I’m old enough to remember its release.
Really? Cause I don’t know, I can play Shadow of the Colossus, Resident Evil 4, Metal Gear Solid 3, Ninja Gaiden Black, God of War, Burnout Revenge and GTA San Andreas just fine.
And yes, those are all 20 years ago. You are now dead and I made it happen.
As a side note, man, 2005 was a YEAR in gaming. That list gives 1998 a run for its money.
I would say GoW and SotC at least take realism as inspiration, but aren’t realistic. They’re like an idealized version of realism. They’re detailed, but they’re absolutely stylized. SotC landscapes, for example, look more like paintings you’d see rather than places you’d see in real life.
Realism is a bad goal because you end up making every game look the same. Taking our world as inspiration is fine, but it should almost always be expanded on. Know what your game is and make the art style enhance it. Don’t just replicate realism because that’s “what you’re supposed to do.”
Look, don’t take it personally, but I disagree as hard as humanly possible.
Claiming that realism “makes every game look the same” is a shocking statement, and I don’t think you mean it like it sounds. That’s like saying that every movie looks the same because they all use photographing people as a core technique.
If anything, I don’t know what “realism” is supposed to mean. What is more realistic? Yakuza because it does these harsh, photo-based textures meant to highlight all the pores or, say, a Pixar movie where everything is built on this insanely accurate light transfer, path traced simulation?
At any rate, the idea that taking photorealism as a target means you give up on aesthetics or artistic intent is baffling. That’s not even a little bit how it works.
On the other point, I think you’re blending technical limitations with intent in ways that are a bit fallacious. SotC is stylized, for sure, in that… well, there are kaijus running around and you sometimes get teleported by black tendrils back to your sleeping beauty girlfriend.
But is it aiming at photorealism? Hell yeah. That approach to faking dynamic range, the deliberate crushing of exteriors from interiors, the way the sky gets treated, the outright visible air adding distance and scale when you look at the colossi from a distance, the desaturated take on natural spaces… That game is meant to look like it was shot by a camera all the way. They worked SO hard to make a PS2 look like it has aperture and grain and a piece of celluloid capturing light. Harder than the newer remake, arguably.
Some of that applies to GoW, too, except they are trying to make things look like Jason and the Argonauts more than Saving Private Ryan. But still, the references are filmic.
I guess we’re back to the problem of establishing what people mean by “realism” and how it makes no sense. In what world does Cyberpunk look similar to Indiana Jones or Wukong? It just has no real meaning as a statement.
If anything, I don’t know what “realism” is supposed to mean. What is more realistic? Yakuza because it does these harsh, photo-based textures meant to highlight all the pores or, say, a Pixar movie where everything is built on this insanely accurate light transfer, path traced simulation?
The former is more realistic, but not for that reason. The lighting techniques are techniques, not a style. Realism is trying to recreate the look of the real world. Pixar is not doing that. They’re using advanced lighting techniques to enhance their stylized worlds.
Some of that applies to GoW, too, except they are trying to make things look like Jason and the Argonauts more than Saving Private Ryan. But still, the references are filmic.
Being inspired by film is not the same as trying to replicate the real world. (I’d argue it’s antithetical to it to an extent.) Usually film is trying to be more than realistic. Sure, it’s taking images from the real world, but they use lighting, perspective, and all kinds of other tools to enhance the film. They don’t just put some actors in place in the real environment and film it without thought. There’s intent behind everything shown.
I guess we’re back to the problem of establishing what people mean by “realism” and how it makes no sense. In what world does Cyberpunk look similar to Indiana Jones or Wukong? It just has no real meaning as a statement.
Cyberpunk looks more like Indiana Jones than Persona 5. Sure, they stand out from each other, but it’s mostly due to environments.
I think there’s plenty of games that benefit from realism, but not all of them do. There are many games that could do better with stylized graphics instead. For example, Cyberpunk is represented incredibly well in both the game and the anime. They both have different things they do better, and the anime’s style is an advantage for the show at least. The graphics style should be chosen to enhance the game. It shouldn’t just be realistic because it can be. If realism is the goal, fine. If it’s supposed to be more (or different) than realism, maybe try a different style that improves the game.
Realism is incredibly hard to create assets for, so it costs more money, and usually takes more system resources. For the games that are improved by it, that’s fine. There are a lot of games that could be made on a smaller budget, faster, run better, and look more visually interesting if they chose a different style, though. I think it should be a choice that developers are allowed to make, but most are just told to do realism because it’s the “premium” style. They aren’t allowed to do things that are better suited for their game. I think this is bad, and it also leads to a lack of diversity in styles.
I don’t understand what you’re saying. Or, I do, but if I do, then you don’t.
I think you’re mixing up technique with style, in fact. And really confusing a rendering technique with an aesthetic. But beyond that, you’re ignoring so many games. So many. Just last year, how do you look at Balatro and Penny’s Big Breakaway and Indiana Jones and go “ah, yes, games all look the same now”. The list of GOTY nominees in the TGAs was Astro Bot, Balatro, Wukong, Metaphor, Elden Ring and Final Fantasy VII R. How do you look at that list of games and go “ah, yes, same old, same old”.
Whenever I see takes like these I can’t help but think that people who like to talk about games don’t play enough games, or just think of a handful of high profile releases as all of gaming. Because man, there’s so much stuff and it goes from grungy, chunky pixel art to lofi PS1-era jank to pitch-perfect anime cel shading to naturalistic light simulation. If you’re out there thinking games look samey you have more of a need to switch genres than devs to switch approach, I think.
By “all games look the same” I’m being hyperbolic. I mean nearly all AAA games and the majority of AA games (and not an insignificant number of indies even).
Watch this video. Maybe it’ll help you understand what I’m saying.
Whenever I see takes like these I can’t help but think that people who like to talk about games don’t play enough games, or just think of a handful of high profile releases as all of gaming.
Lol. No. Again, I was being hyperbolic and talking mostly about the AAA and AA space. I personally almost exclusively play indies who know what they’re trying to make and use a style appropriate to it. I play probably too many games. I also occasionally make games myself, I was an officer in a game development club in college, and I have friends in the industry. I’m not just some person who doesn’t understand video games.
Did those go for realism though, or were they just good at balancing the more detailed art design with the gameplay?
Absolutely they went for realism. That was the absolute peak of graphics tech in 2004, are you kidding me? I gawked at the fur in Shadow of the Colossus, GTA was insane for detail and size for an open world at the time. Resi 4 was one of the best looking games that gen and when the 360 came out later that year it absolutely was the “last gen still looked good” game people pointed at.
I only went for that year because I wanted the round number, but before that Silent Hill 2 came out in 2001 and that was such a ridiculous step up in lighting tech I didn’t believe it was real time when the first screenshots came out. It still looks great, it still plays… well, like Silent Hill, and it’s still a fantastic game I can get back into, even with the modern remake in place.
This isn’t a zero sum game. You don’t trade gameplay or artistry for rendering features or photorealism. Those happen in parallel.
They clearly balanced the more detailed art design with the gameplay.
GTA didn’t have detail on cars to the level of a racing game, and didn’t have characters with as much detail as Resident Evil, so that it could have a larger world for example. Colossus had fewer objects on screen so it could put more detail on what was there.
Yeah. So like every other game.
Nothing was going harder for visuals, so by default that’s what was happening. They were pushing visuals as hard as they would go with the tech that they had.
The big change isn’t that they balanced visuals and gameplay. If anything the big change is that visuals were capped by performance rather than budget (well, short of offline CG cutscenes and VO, I suppose).
If anything they were pushing visuals harder than now. There is no way you’d see a pixel art deck building game on GOTY lists in 2005, it was all AAA as far as the eye could see. We pay less attention to technological escalation now, by some margin.
Yeah. So like every other game.
Except for the ones that don’t do a good job of balancing the two things. Like the games that have incredible detail but shit performance and/or awful gameplay.
STALKER is good, though I played a lot of Anomaly mostly, and I’m not sure that STALKER was ever known for bleeding edge graphics
STALKER GAMMA is free if anyone wants to try it out. I ended up buying the OG games cause I liked it so much.
The 2nd one is good, but I would advise people to wait until they implement more promised features before they buy it.
I just finished STALKER 2. It’s a fucking mess and was unplayably broken for half a month at one point for me, and I fucking love it. It took me 80 hours of mostly focusing on advancing the story to reach the end, and I feel like I only saw maybe 30% of what’s out there. I can already tell that this is going to be my new Skyrim, tooling around with 500 hours in the game and still finding new situations. I’m SO FUCKING PUMPED for Anomaly 2; a lot of the same modders that worked on Anomaly are already putting out modpacks for STALKER 2.
Factorio and Balatro
A Link to the Past > Ocarina of Time
Fight me
I’ve been playing the Zelda games in order since the new one was announced for the Switch, and I’m stuck on OoT (Zelda 2 was a pain as well).
I don’t have much free time.
Idk, I’d say that pursuing realism is worthy, but you get diminishing returns pretty quickly when all the advances are strictly in one (or I guess two, with audio) sense. Graphical improvements massively improved the experience of the game moving from NES or Game Boy to SNES and again to PS1 and N64. I’d say that the most impressive leap, imo, was PS1/N64 to PS2/Xbox/GameCube. After that, I’d say we got 3/4 of the return from improvements to the PS3 generation, 1/2 the improvement to the PS4 gen, 1/5 the improvement to the PS5, and 1/8 the improvement when we move on to the PS5 Pro. I’d guess if you plotted out the value add, with the perceived value on the Y axis and the time series or compute ability or texture density or whatever on the X axis, it’d probably look a bit like a square root curve.
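Just to make that shape concrete, here’s a throwaway Python sketch; the per-generation fractions are the gut-feel numbers above, and the compute multipliers are completely made up for illustration, not benchmarks:

```python
# Toy illustration of the diminishing-returns curve described above.
# "gains" are the gut-feel fractions from the comment (relative to the
# PS1 -> PS2 leap being 1.0); "compute" is an invented stand-in, not real data.

gains = {
    "PS2":     1.0,
    "PS3":     0.75,
    "PS4":     0.5,
    "PS5":     0.2,
    "PS5 Pro": 0.125,
}

compute = 1.0   # arbitrary units; pretend each generation is ~10x the raw power
value = 0.0
for gen, gain in gains.items():
    value += gain
    print(f"{gen:8s} compute ~{compute:>8.0f}x   cumulative perceived value {value:.2f}   "
          f"value per unit of compute {value / compute:.4f}")
    compute *= 10

# Perceived value keeps climbing, but ever more slowly, while the compute cost
# explodes, which is exactly the flattening curve being described.
```

None of the numbers mean anything on their own; the point is just that a roughly square-root-shaped value curve set against exponentially growing compute makes every new generation feel smaller than the last.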
I do think that there’s an (understandably, don’t get me wrong) untapped frontier in gaming realism in that games don’t really engage your sense of touch or any of the subsets thereof. The first step in this direction is probably vibrating controllers, and I find that it definitely does make the game feel more immersive. Likewise, few games engage your proprioception (that is, your knowledge of your body position in space), though there’ve been attempts to engage it via the Switch, Wii, and VR. There are, of course, enormous technical barriers, but I think there’s very clearly a good reason why a brain interface is sort of thought of as the holy grail of gaming.
Having a direct brain interface game that’s realistic enough to overcome the uncanny valley would destroy people’s lives. People would, inevitably, prefer their virtual environment to the real one. They’d end up wasting away, plugged into some machine. It would lend serious credence to the idea of a simulated universe, and reduce the human experience by replacing it with an improved one. Shit, give me a universe wherein I can double-jump, fly, or communicate with animals, and I’d have a hard time returning to this version.
We could probably get close with a haptic feedback suit, a mechanism that allows you to run/jump in any direction, and a VR headset, but there would always be something tethering you to reality. But a direct brain to machine interaction would have none of that, it would essentially be hijacking our own electrical neural network to run simulations. Much like Humans trying to play Doom on literally everything. It would be as amazing as it was destructive, finally realizing the warnings from so many parents before its time: “that thing’ll fry your brain.”
Tbf, it’s kinda bullshit that we can’t double jump IRL. Double jumping just feels right, like it’s something we should be able to do.
Yeah, no, it’d likely be really awful for us. I mean, can you imagine what porn would be like on that? That’s a Fermi paradox solution right there. I could see the tech having a lot of really great applications too, like training simulations for example, but the video game use case is simultaneously exhilarating and terrifying.
People would, inevitably, prefer their virtual environment to the real one. They’d end up wasting away, plugged into some machine. It would lend serious credence to the idea of a simulated universe, and reduce the human experience by replacing it with an improved one.
Have you considered making the real world better?
Nah, that would cut into profits.
Like CGI and other visual effects, realism has some applications that can massively improve the experience in some games. Just like how lighting has a massive impact, or sound design, etc.
Chasing it at the expense of gameplay or art design is a negative though.
I agree generally, but I have to offer a counterpoint with Kingdom Come: Deliverance. I only just got back into it after bouncing off in 2019, and I wish I hadn’t stopped playing. I have a decent-ish PC and it still blows my entire mind when I go roaming around the countryside.
Like Picard said above, in due time this too will look aged, but even 7 years on, it looks and plays incredible even at less-than-highest settings. IMHO the most visually impressive game ever created (disclaimer: I haven’t seen or played Horizon). Can’t wait to play KC:D 2!
not really. plenty of great games have visual fidelity as a big help in making it good.
i dont think rdr2 would be such a beautiful immersive experience if it had crappy graphics.
Visual fidelity isn’t the same as realism. RDR2 is trying to replicate a real experience, so I mostly agree with you. However, it does step away from realism sometimes to create something more.
Take a look at impressionist art, for example. It starts at realism, but it isn’t realistic. It has more style to it that enhances what the artist saw (or wanted to highlight).
A game should focus on the experience it’s trying to create, and its art style should enhance that experience. It shouldn’t just be realistic because that’s the “premium” style.
For an example, Mirror’s Edge has a high amount of fidelity (for its time), but it’s highly stylized in order to create the experience they wanted out of it. The game would be far worse if they tried to make the graphics realistic. This is true for most games, though some do try to simulate being a part of this world, and it’s fine for them to try to replicate it because it suits what their game is.
Couldn’t disagree more. Immersion comes from the details, not the fidelity. I was told to expect this incredibly immersive experience from RDR2 and then I got:
- carving up animals is frequently wonky
- gun cleaning is just autopilot wiping the exterior of a gun
- shaving might as well be done off-screen
- you transport things on your horse without tying them down
Yeah that didn’t do it for me.
realism and visual fidelity are two slightly overlapping but different things.
a game can have great graphics but its npcs be unrealistic bullet sponges. cp2077 comes to mind, not that this makes it a bad game necessarily.
i dont actually want to go to the bathroom in-game but i love me some well written story, graphics can help immensely with that. among other things.
come to think of it, 100% realistic games would probably be boring
I had way more fun in GTA 3 than GTA 5. RDR2 isn’t a success because the horse has realistic balls.
To put another nail in the coffin, ARMA’s latest incarnation isn’t the most realistic shooter ever made. No amount of wavy grass and moon phases can beat realistic weapon handling in the fps sim space. (And no ARMA’s weapon handling is not realistic, it’s what a bunch of keyboard warriors decided was realistic because it made them feel superior.) Hilariously the most realistic shooter was a recruiting game made by the US Army with half the graphics.
realism and visual fidelity are not the same thing.
BUT, visual fidelity adds a LOT to the great writing in rdr2.
Yeah, but you said it was a prerequisite and that’s just false.
you are right, i didn’t notice i had worded it that way and it’s not what i meant
I see, and yeah graphics can help a lot. But how much do we actually need? At what point is the gain not enough to justify forcing everyone to buy another generation of GPUs?
i think as it advances the old ones will inevitably look dated, i don’t think there will be a limit short of photorealism, it’s just slowed down a bunch now. imagine if we had a game like rdr but actually photorealistic. shit, with vr you could have any photorealistic and immersive world, that would be so cool.
sadly, the profit motive makes it difficult for a given studio to want to optimize their games, making them heavier and heavier, and gpus turned out to be super profitable for AI, making them more and more expensive. i think things will definitely stagnate for a bit, but not before they find a way to put that ray tracing hardware we have now to good use, so we’ll see about that.
It’s the right choice for some games and not for others. Just like cinematography, there’s different styles and creators need to pick which works best for what they’re trying to convey. Would HZD look better styled like Hi-Fi Rush? I don’t really think so. GOW? That one I could definitely see working more stylized.
This is true of literally any technology. There are so many things that can be improved in the early stages that progress seems very fast. Over time, the industry finds most of the optimal ways of doing things and starts hitting diminishing returns on research & development.
The only way to break out of this cycle is to discover a paradigm shift that changes the overall structure of the industry and forces a rethinking of existing solutions.
The automobile is a very mature technology and is thus a great example of these trends. Cars have achieved optimal design and slowed to incremental progress multiple times, only to have the cycle broken by paradigm shifts. The most recent one is electrification.
Okay then why are they arbitrarily requiring new GPUs? It’s not just about the diminishing returns of “next gen graphics”.
If you think about it, gaming GPUs have been in a state of crisis for over half a decade. First shortages because everybody used them to mine bitcoin, then the covid chip shortages happened, and now AI is killing cheaper GPUs. As a result, many people are stuck with older hardware, Steam Decks, and consoles, and haven’t upgraded their systems, and those highly flammable $1000+ GPUs will not lead to everyone upgrading their PCs. So games are using older GPUs as their target.
That’s exactly why. Diminishing returns means exponentially more processing power for minimal visual improvement.
I think my real question is what point do we stop trying until researchers make another breakthrough?
Researchers can’t make a breakthrough if they don’t try ^^
AAA game designers don’t need to be the researchers.
That’s what game engines are for
Great, let the game engine people go wild. We don’t need to try and build the next Far Cry with all of their beta tech though.
Path tracing is a paradigm shift, a completely different way of rendering a scene from what’s normally done; it’s just a slow and expensive one (it has existed for many years but only started to become possible in real time recently due to advancing gpu hardware).
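If it helps to see the difference in miniature, here’s a toy Python sketch of the core idea for a single diffuse point under a made-up sky (nothing like how a real engine structures it): instead of shading with a fixed formula the way rasterization does, you average random samples of the incoming light, and the cost scales directly with how many samples per pixel you can afford.

```python
import math
import random

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def cross(u, v):
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

def cosine_sample(n):
    # Random direction on the hemisphere around unit normal n,
    # weighted by cos(theta) (pdf = cos(theta) / pi).
    r1, r2 = random.random(), random.random()
    phi = 2.0 * math.pi * r1
    x, y, z = math.cos(phi) * math.sqrt(r2), math.sin(phi) * math.sqrt(r2), math.sqrt(1.0 - r2)
    a = (1.0, 0.0, 0.0) if abs(n[0]) < 0.9 else (0.0, 1.0, 0.0)
    t = normalize(cross(n, a))
    b = cross(n, t)
    return normalize(tuple(x * t[i] + y * b[i] + z * n[i] for i in range(3)))

def sky(direction):
    # Made-up environment light: a bright patch overhead, dim everywhere else.
    return 5.0 if direction[2] > 0.8 else 0.2

def shade(normal, albedo, samples):
    # Monte Carlo estimate of the light leaving a diffuse surface.
    # With cosine-weighted sampling this simplifies to albedo * mean(incoming light).
    total = sum(sky(cosine_sample(normal)) for _ in range(samples))
    return albedo * total / samples

if __name__ == "__main__":
    up = (0.0, 0.0, 1.0)
    for spp in (4, 64, 1024):  # more samples = less noise, more cost
        print(spp, "samples ->", round(shade(up, albedo=0.7, samples=spp), 3))
```

A real path tracer does something like this recursively, for every bounce, at every pixel, every frame, which is why it needs dedicated RT hardware plus aggressive denoising to run in real time at all.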
Yes, usually the improvement is minimal. That is because games are designed around rasterization and have path tracing as an afterthought. The quality of path tracing still isn’t great because a bunch of tricks are currently needed to make it run faster.
You could say the same about EVs actually, they have existed since like the 1920s but only are becoming useful for actual driving because of advancing battery technology.
Then let the tech mature more so it’s actually analogous with modern EVs and not EVs 30 years ago.
Yea, it’s doing that. RT is getting cheaper, and PT is not really used outside of things like cyberpunk “rt overdrive” which are basically just for show.
Except it’s being forced on us and we have to buy more and more powerful GPUs just to handle the minimums. And the new stuff isn’t stable anyways. So we get the ability to see the peach fuzz on a character’s face if we have a water-cooled $5,000 spaceship. But the guy rocking solid GPU tech from 2 years ago has to deal with stuttering and crashes.
This is insane, and we shouldn’t be buying into this.
It’s not really about detail, it’s about basic lighting especially in dynamic situations
(Sometimes it is used to provide more detail in shadows I guess, but that is also usually a pretty big visual improvement)
I think there’s currently a single popular game where rt is required? And I honestly doubt a card old enough to not support ray tracing would be fast enough for any alternate minimum setting it would have had instead. Maybe the people with 1080 ti-s are missing out, but there’s not that many of them honestly. I haven’t played that game and don’t know all that much about it, it might be a pointless requirement for all I know.
Nowadays budget cards support rt, even integrated gpus do (at probably unusable levels of speed, but still)
I don’t think every game needs rt or that rt should be required, but it’s currently the only way to get the best graphics, and it has the potential to completely change what is possible with the visual style of games in the future.
Edit: also the vast majority of new solid gpus started supporting rt 6 years ago, with the 20 series from nvidia
That’s my point though, the minimums are jacked up well beyond where they need to be in order to cram new tech in and get 1 percent better graphics even without RT. There’s not been any significant upgrade to graphics in the last 5 years, but try playing a 2025 AAA with a 2020 graphics card. It might work, but it’s certainly not supported and some games are actually locking out old GPUs.
Ironically, Zelda Link to the Past ran at 60fps, and Ocarina of Time ran at 20fps.
The same framerates are probably in the Horizon pictures below lol.
Now, Ocarina of Time had to run at 20fps because it had one of the biggest draw distances of any N64 game at the time. This was so the player could see to the other end of Hyrule Field, or other large spaces. They had to sacrifice framerate, but for the time it was totally worth the sacrifice.
Modern games sacrifice performance for an improvement so tiny that most people would not be able to tell unless they are sitting 2 feet from a large 4k screen.
Had to, as in “they didn’t have enough experience to optimize the games”. Same for Super Mario 64. Some programmers decompiled the code and made it run like a dream on original hardware.
The programming knowledge did not exist at the time. It’s not that they didn’t have the experience; it was impossible for them to have the knowledge because it simply didn’t exist yet. You can’t really count that against them.
Kaze optimizing Mario 64 is amazing, but it would have been impossible for Nintendo to have programmed the game like that, because Kaze is able to use programming techniques and knowledge that literally did not exist when the N64 was new. It’s like saying that the NASA engineers who designed the Atlas LV-3B launch vehicle were bad engineers, or incapable of making a good rocket design, just because of what NASA engineers could design today with knowledge that did not exist in the ’50s.
One of the reasons I skipped the other consoles but got a GameCube was because all the first party stuff was buttery smooth. Meanwhile trying to play shit like MechAssault on Xbox was painful.
I never had trouble with MechAssault, because the fun far outweighed infrequent performance drops.
I am a big proponent of 60fps minimum, but I make an exception for consoles from the 5th and 6th generations. The amount of technical leap and improvement, both in graphics technology and in gameplay innovation, far outweighs any performance dips as a cost of such improvement. The 7th generation is on a game-by-game basis, and personally the 8th generation (Xbox One, Switch, and PS4) is where it became completely unacceptable to run even a single frame below 60fps. There is no reason that target could not have been met by then, let alone now. The Switch was especially disappointing with this, since Nintendo made basically a 2015 mid-range smartphone but then tried to make games for it as if it were a real game console, with performance massively suffering as a result. 11fps, docked, in Breath of the Wild’s Korok Forest or Age of Calamity (anywhere in the game, take your pick) is totally unacceptable, even if it only happened one time ever rather than consistently.
when i was a smol i thought i needed to buy the memory expansion pack whenever OoT fps tanked.
Kind of like smartphones. They all kind of blew up into this rectangular slab, and…
Nothing. It’s all the same shit. I’m using a OnePlus 6T from 2018, and I think I’ll have it easily for another 3 years. Things eventually just stagnate.
You can easily keep a phone for 7 years.
One company put a stupid fucking notch in their screen and everyone bought that phone, so now every company has to put a stupid fucking notch in the screen
I just got my tax refund. If someone can show me a modern phone with a 9:16 aspect ratio and no notch, I will buy it right now
I was hoping that eventually smartphones would evolve to do everything. Especially when things like Samsung Dex were introduced, it looked to me like maybe in the future phones could replace desktops, running a full desktop OS when docked and a simplified mobile UI plus power saving in mobile mode.
But no, I only have a locked-down computer.
there is an official android desktop mode, I tried it and it isn’t great ofc but my phone manufacturer (oneplus) has clearly put no work into making it functional
Yeah whatever happened to that? That was such a good idea and could have been absolutely game changing if it was actually marketed to the people who would benefit the most from it
I used it for a while when I worked two jobs. I’d clock out of job 1, and had an agreement with them to be allowed to use the screen and input devices at my desk for job 2. Then I’d plug in my Tab S8 and get to work, instead of having to carry two chunky laptops.
So it still exists! What I noticed is that a Snapdragon 8 Gen 1 feels underpowered and that Android, and this is the bigger issue, does not have a single browser that works like a full-fledged desktop version. All the browsers I tested had some shortcomings, especially with drag and drop or context menus. Things work, but you’re constantly reminded that you’re running a mobile OS: weird behavior, oversized context menus, and so on.

I wish you could launch into a Linux VM instead of the Dex UI. Or that Samsung would double down on the concept. The Motorola Atrix was so far ahead of its time. Your phone transforming into your tablet, into your laptop, into your desktop. How fucking cool is that?
Apple would be in a prime position, their entire ecosystem is now ARM based and they have the chips with enough power. But it’s not their style to do something cool that threatens their bottom line. Why sell one phone when you can sell a phone, laptop, tablet, and desktop separately?

It’s super easy to forget, but Ubuntu tried to do this back in the day with Convergence as well, and amusingly this article also compares it to Microsoft’s solution on Windows Phone. It’s a brilliant idea, but apparently no corporation with the ecosystem to make it actually happen has the will to risk actually changing the world, despite every company talking about wanting an “iPhone moment”.
Apple would be in a prime position, their entire ecosystem is now ARM based and they have the chips with enough power. But it’s not their style to do something cool that threatens their bottom line. Why sell one phone when you can sell a phone, laptop, tablet, and desktop separately?
Let’s be real, Apple’s biggest risk would be losing the entire student and young professional market by actually demonstrating that they don’t need a MacBook Pro to use the same 5 web apps that would work just as well on a decent Chromebook (if such a thing existed).
Linux VM
Or just something like Termux, a terminal emulator for Android. Example screenshot (XFCE desktop over VNC server), I didn’t know what to fit in there:
Full desktop apps, running natively under Android. For better compatibility Termux also has proot-distro (similar to chroot) where you can have… let me copy-paste
Supported distributions (format: name < alias >):

* Alpine Linux < alpine >
* Arch Linux < archlinux >
* Artix Linux < artix >
* Chimera Linux < chimera >
* Debian (bookworm) < debian >
* deepin < deepin >
* Fedora < fedora >
* Manjaro < manjaro >
* openKylin < openkylin >
* OpenSUSE < opensuse >
* Pardus < pardus >
* Ubuntu (24.04) < ubuntu >
* Void Linux < void >

Install selected one with: proot-distro install <alias>
Though there is apparently some performance hit. I just prefer Android, but maybe you could run even full LibreOffice under some distro this way.
If it can be done by Termux, then someone like Samsung could definitely make something like that too, but integrated with the system and with more software available in their repos.
What’s missing from the picture but is interesting too is NGINX server (reverse proxy, lazy file sharing, wget mirrored static website serving), kiwix-serve (serving ZIM files including the entire Wikipedia from SD card) and Navidrome (music server).
And brought to any internet-connected computer via Cloudflare QuickTunnel (because it doesn’t need an account or a domain name). The mobile data upload speed will finally matter, a lot. You get the idea: GNU+Linux. And Android already has the Linux kernel part.
Yeah, I remember trying it and while it works the performance hit was too big for my use case. But it’s been a while!
Fortunately I’m in a position where I don’t have to juggle two jobs anymore so I barely use Dex these days.
Which in reverse is also why Samsung isn’t investing a lot into it I suppose - it’s a niche use case. I would guess that generally people with a desktop setup would want something with more performance than a mobile chip.
I miss physical keyboards on phones
Maybe make the rectangular slab smaller again?
I would love to have a smaller phone. Not thinner, smaller. I don’t care if it’s a bit thick, but I do care if the screen is so big I can’t reach across it with one hand.
What do you expect next? Folding phones? That would be silly!