Don’t learn a language unless you need to use it for something.
That’s why you’re finding it hard. If you needed to program a game, decided on Unity, and had a specific thing to do, it would be easy to figure out how to do that in C#.
Honestly, the killer application is really simple, though this headset wasn’t quite designed for it (nor is macOS in general): acting as an external monitor.
You know what’s annoying? Trying to use your computer outside, on an airplane, or while travelling. Or being in an open-plan office with a million visual distractions.
If you’re working in a professional setting where your company is already buying you a giant ultrawide display or multiple professional 27" screens, then you’re approaching a thousand or two dollars spent on each employee, and suddenly a VR headset starts to look more reasonable as a monitor replacement.
If this was closer to the size of the Bigscreen Beyond and just worked as an external display that let you place as many windows / monitors around you as you wanted, they might actually have a compelling product.
Or if it was cheaper it could be used for gaming.
Or if it had transparent AR displays, it could be used for industrial applications like HoloLens.
But yeah, as is, it feels like it had a neat idea or two, some really fancy tech, and fell right in the middle of not being that useful for anyone.
And you’re someone who cares enough about privacy to subscribe to this community.
Which is why the only actually viable solution is legislation and privacy protection laws.
I’m not going to argue that Roku’s software is better; it’s definitely worse. But honestly, it’s not that much worse, and it doesn’t really impact day-to-day usage.
The voice recognition in the remote is slightly worse, and the OS is less pretty and a little slower to navigate, but when 90% of its use is either playing something or displaying a screensaver, none of that really matters. It still opens instantly when I turn the Xbox on, and it still lets me open whatever app I need and select a show. And it has one genuinely great feature that Google TV doesn’t: private listening, where the audio plays from the Roku app on your phone so you can use headphones and not wake anyone.
Honestly, I would buy the best picture quality TV I could and not worry about Google OS or Roku OS at this point. And if you do get a Roku TV, I definitely don’t think it’s worth giving Google more money on top of that.
This sounds like someone who’s never worked on a large Python project with multiple developers. I’ve been doing this for almost two decades and we never encounter bugs because of mismatched types.
Have you worked on major projects in other languages in that time period to be able to compare and contrast?
The last two Python projects I had to work on didn’t have bugs because of type issues, but it was much harder to come into the codebase and understand what was going on, because type information was missing in many, many places, which forced you to go back and comb through the code instead.
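Something like this, to make it concrete (a made-up sketch, not from either codebase):

from collections import defaultdict
from collections.abc import Iterable

# Untyped: you have to comb through callers to figure out what
# "records" and "key" even are, and what comes back.
def group(records, key):
    ...

# Annotated: the contract is right there in the signature.
def group_typed(records: Iterable[dict[str, str]], key: str) -> dict[str, list[dict[str, str]]]:
    buckets: dict[str, list[dict[str, str]]] = defaultdict(list)
    for record in records:
        buckets[record[key]].append(record)
    return dict(buckets)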
Yeah, working on Python projects professionally is always a nightmare of configuring env variables and trying to get your system to perfectly match the reference dev system. I find Node.js projects to often be the simplest and most pain-free to set up, but even compiled languages like C# and Java are often easier to get up and running than Python.
I don’t mean this insultingly (lots of programming jobs don’t require this, and for the ones that do, we all tend to start here), but in all honesty this sounds like it’s coming from someone who’s never worked on a large project maintained by multiple people over time.
First of all, the hysteria over semicolons is ridiculous when JavaScript, TypeScript, C#, Java, Go, Swift, etc. will all automatically highlight missing semicolons, if not insert them for you, when paired with an IDE and a standard linter. On top of that, JavaScript and TypeScript don’t require semicolons at all, but they’re still used alongside braces because they make your code more scannable, readable, and moveable.
Secondly, without type safety your code is no longer predictable or maintainable. If you’re quickly sketching out some newfangled logic for a research paper, that’s one thing; if you need to write test code so that your codebase can be tested an infinite number of times by other coders and whatever CI/CD pipelines, to make sure that nothing’s broken, then all of a sudden you start seeing the value in strict typing.
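To make that concrete, here’s a made-up sketch (function and numbers are hypothetical) of the kind of mistake a checker like mypy catches in CI before a single test runs:

def apply_discount(price_cents: int, percent: float) -> int:
    # Round the discounted price to whole cents.
    return round(price_cents * (1 - percent / 100))

apply_discount(1999, 15)       # fine
apply_discount("19.99", 15)    # flagged statically: "str" is not an "int"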
Honestly, complaining about type safety adding “extra code” is a complaint almost every coder has when starting out, before eventually realizing that all that “extra code” isn’t just boilerplate for no reason; it adds specificity, predictability, reusability, and maintainability to your codebase.
When defining types looked like this, it was one thing:
String name = new String("Charles Xavier");
But when it looks like this, there’s no reason not to use strong typing:
const name = "Charles Xavier";
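And for what it’s worth, Python’s optional annotations have ended up just as lightweight (assuming a checker like mypy or pyright is enforcing them):

name: str = "Charles Xavier"   # one annotation, statically checked
count = len(name)              # no annotation needed; inferred as int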
Lmao, bruh. How do people keep praising a language where messing up a space breaks everything and there is no real type system?
Java and C# are very similar; worst-case scenario, learn Java and then C# will be easy.