

The analog dials were an illusion. That information has been processed digitally for at least the last 25 years.
What I’m saying is that if YouTube is sharing $10 million of revenue with channel owners in a month that has 1,000,000,000 total views across YouTube, that’s a penny per view.
Then, if the next month they reconfigure the view counts to exclude certain bots or views below a particular threshold, you might see the overall view count drop from 1,000,000,000 to 500,000,000 while still hitting the same overall revenue. At that point it’s $0.02 per view, so a channel whose view count drops in half may still see the same revenue.
If it’s a methodology change across all of YouTube, a channel that stays equally popular as a percentage of all views will see the revenue stay the same, even if the view counts drop (because every other channel is seeing their view counts drop, too).
Isn’t that the formula? They take all of the revenue, set aside the percentage they’ve set for revenue share, and then divide that among all channels based on viewer counts. Dropping viewership for all channels proportionally means that the same amount of revenue will still be distributed to the channels in the previous ratios.
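In toy-model form (made-up pool size and view counts here, nothing is YouTube’s actual data):

```python
# Toy model of proportional revenue share: the payout pool is fixed, and a
# channel's cut depends only on its fraction of total counted views.
def payouts(pool_dollars, channel_views):
    total = sum(channel_views.values())
    return {ch: pool_dollars * v / total for ch, v in channel_views.items()}

before = payouts(10_000_000, {"mine": 10_000_000, "rest": 990_000_000})
# A methodology change halves every channel's counted views proportionally:
after = payouts(10_000_000, {"mine": 5_000_000, "rest": 495_000_000})
print(before["mine"], after["mine"])  # 100000.0 100000.0 -- payout unchanged
```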
Most 4K streams are 8-20 Mbps. A UHD Blu-ray runs at up to 128 Mbps.
Bitrate is only one variable in overall perceived quality. There are all sorts of tricks that can significantly reduce file size (and thus bitrate of a stream) without a perceptible loss of quality. And somewhat counterintuitively, the compression tricks work a lot better on higher resolution source video, which is why each quadrupling in pixels (doubling height and width) doesn’t quadruple file size.
The codec matters (h.264 vs h.265/HEVC vs VP9 vs AV1), and so do the settings actually used to encode. Netflix famously is willing to spend a lot more computational power on encoding, because they have a relatively small number of videos and many, many users watching the same videos. In contrast, YouTube and Facebook don’t even bother re-encoding into a more efficient codec like AV1 until a video gets enough views that they think they can make up the cost of additional processing with the savings of lower bandwidth.
Video encoding is a very complex topic, and simple bitrate comparisons barely scratch the surface of perceived quality.
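For a rough sense of the gap, here’s the bits-per-pixel arithmetic behind those two numbers (illustrative figures, assuming 24 fps for both sources):

```python
# Bits per pixel for two 4K sources. An 8x bitrate gap doesn't translate to
# 8x the perceived quality: codec choice and encoder settings absorb much of
# the difference.
def bits_per_pixel(bitrate_bps, width, height, fps):
    return bitrate_bps / (width * height * fps)

stream = bits_per_pixel(16_000_000, 3840, 2160, 24)  # mid-range 4K stream
disc = bits_per_pixel(128_000_000, 3840, 2160, 24)   # UHD Blu-ray ceiling
print(f"stream: {stream:.3f} bpp, disc: {disc:.3f} bpp")  # ~0.080 vs ~0.643
```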
Article is paywalled for me.
Does it describe the methodology of how they use the transmitter and receiver?
What specifically are they transmitting? Is it actually wifi signals within the 802.11 protocols, or is “wifi” just shorthand for emitting radio waves in the same spectrum bands as wifi?
Yeah I’m with you.
“Using this technological advancement to improve health care is good”
“Not in countries where health care is publicly run”
“What” is the correct response here.
“The only difference between the two emails was the link,” the memo said. “ActBlue delivered. WinRed got flagged. That is not a coincidence.”
It could also be that winred links are more often associated with spam because emails containing them use a style that actual spam also uses. Like if spammers use words like Trump a lot to try to scam victims, and a lot of those emails get flagged as spam, then the word Trump itself becomes more highly correlated with spam. And since the word Trump is highly associated with winred links, maybe winred gets caught up in the rule set/heuristics that associate Trump fundraisers with spam.
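A toy naive-Bayes-style sketch of that guilt-by-association effect (every count here is made up for illustration):

```python
# If "trump" shows up in lots of labeled spam, any email containing it
# (including a legitimate WinRed fundraiser) starts with a higher spam score.
spam_count, ham_count = 10_000, 10_000
word_in_spam = {"trump": 4_000, "donate": 3_000, "meeting": 50}
word_in_ham = {"trump": 400, "donate": 500, "meeting": 2_000}

def spam_probability(words):
    # Bayes with equal priors and simple add-one smoothing.
    p_spam = p_ham = 1.0
    for w in words:
        p_spam *= (word_in_spam.get(w, 0) + 1) / (spam_count + 2)
        p_ham *= (word_in_ham.get(w, 0) + 1) / (ham_count + 2)
    return p_spam / (p_spam + p_ham)

print(spam_probability(["trump", "donate"]))  # ~0.98: flagged
print(spam_probability(["meeting"]))          # ~0.02: delivered
```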
How does being a dick to users get back at site admins you don’t like?
That’s another round trip, and you still have to use JS to identify the browser.
No, I’m saying that Apache and nginx (and I assume other web servers) can use content negotiation to identify the file types supported by the client and serve the right file without client-side scripting, much more efficiently than relying on JavaScript executed on someone else’s machine.
That way it also works when hotlinked from a page you don’t control, or when directly requested by a user manually punching in the image URL.
JavaScript for this seems like the wrong tool. The HTTP server itself can usually be configured to serve alternative images (including different formats) to supporting browsers: JXL if supported, falling back to webp if not, and falling back to JPEG if webp isn’t supported either.
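A minimal sketch of the server side in Python, assuming photo.jxl, photo.webp, and photo.jpg variants already sit next to the script; in practice you’d do this with Apache or nginx configuration rather than application code:

```python
# Serve the best image format the client's Accept header admits,
# falling back to JPEG as the universal default.
from http.server import BaseHTTPRequestHandler, HTTPServer

VARIANTS = [("image/jxl", "photo.jxl"),    # best, if supported
            ("image/webp", "photo.webp"),  # next best
            ("image/jpeg", "photo.jpg")]   # universal fallback

class NegotiatingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        accept = self.headers.get("Accept", "")
        # Pick the first variant the client advertises support for.
        mime, path = next(((m, p) for m, p in VARIANTS if m in accept),
                          VARIANTS[-1])
        with open(path, "rb") as f:
            body = f.read()
        self.send_response(200)
        self.send_header("Content-Type", mime)
        self.send_header("Vary", "Accept")  # keep caches format-aware
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("", 8000), NegotiatingHandler).serve_forever()
```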
And the increased server-side adoption of JXL can run up the stats to encourage the Chromium team to resume support for JXL, and encourage the Firefox team to promote support out of Nightly (where it currently sits behind a flag), especially because one of the most popular competing browsers (Safari on Apple devices) already supports JXL.
It’s not too late.
The current standard on the web is JPEG for photographic images. Everyone agrees that it’s an inefficient standard in terms of quality for file size, and that its 8-bit RGB support isn’t enough for higher dynamic range or transparency. So the different stakeholders have been exploring new modern formats for different things:
WEBP is open source and royalty free, and has wide support, especially from Google (which controls a major image search engine and the dominant web browser). It’s more efficient than JPEG and PNG in both lossy and lossless compression, but at 15 years old it’s showing its age as we move toward cameras that capture better dynamic range than the 8-bit limits of webp (or JPEG, for that matter). It’s still being updated, so things like transparency have been added (but aren’t supported by all webp software).
AVIF supports HDR and has even better file size efficiency than webp. It’s also open source and royalty free, and is maintained by the Linux Foundation (for those who prefer a format controlled by a nonprofit). It supports transparency and animation out of the box, so it doesn’t encounter the same partial support issues as webp. One drawback is that the AVIF format requires a bit more computational power to encode or decode.
HEIC is more efficient than JPEG and supports high bit depth and transparency, but it’s encumbered by patents, so support requires royalty payments. The only reason it’s in the conversation is that it has extensive hardware acceleration support by virtue of its reliance on the HEVC/h.265 codec, and that it’s Apple’s default image format for new pictures taken by iPhone/iPad cameras.
JPEG XL has the best of all possible worlds. It supports higher bit depths, transparency, animation, and lossless compression. It’s open source and royalty free. And most importantly, it has a dedicated compression path for taking existing JPEG images and losslessly shrinking the file size. That’s really important for the vast majority of digitally stored images, because people tend to only have the compressed JPEG version. The actual encoding and decoding is less computationally intensive than webp or avif. It’s a robust enough standard for not just web images, but raw camera captures (potentially replacing DNG and similar formats), raw document scans and other captured imagery (replacing TIFF), and large scale printing (where TIFF is still often in the workflow).
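As a sketch of that recompression path, assuming libjxl’s cjxl command-line tool is installed (for JPEG input it defaults to lossless transcoding, typically shaving around 20% off the file size):

```python
# Losslessly recompress an existing JPEG into JXL by shelling out to cjxl.
# The original JPEG bytes remain exactly reconstructible from the .jxl file.
import os
import subprocess

def jpeg_to_jxl(src: str, dst: str) -> float:
    subprocess.run(["cjxl", src, dst], check=True)
    return 1 - os.path.getsize(dst) / os.path.getsize(src)

saved = jpeg_to_jxl("photo.jpg", "photo.jxl")
print(f"saved {saved:.0%} with no loss of information")
```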
So even as webp and avif and heic show up in more and more places, the constant push forward still allows JXL to compete on its own merits. If nothing else, JXL is the only drop-in replacement where web servers can silently serve the JXL version of a file when supported, even if the “original” image uploaded to the site was in JPEG format, with basically zero drawbacks. And beyond the web, the technical advantages might support entire workflows in JXL, from capture to editing to printing.
You were talking about $1.34 in damages, which doesn’t sound like downtime or disruption.
You will get prison for DDoS in the USA.
Who said anything about DDoS? I’m using ad blockers and saving/caching/archiving websites with a single computer, and not causing damage. I’m just using the website in a way the owner doesn’t like. That’s not a crime, nor should it be.
That’s a crime, yeah, and if Alphabet co wants to sue you for $1.34 in damages then they have that right.
So yeah, I stand by my statement that anyone who thinks this is a crime, or should be a crime, has a poor understanding of either the technology or the law. In this case, even mentioning Alphabet suing for damages means you don’t know the difference between criminal law and civil law.
press charges for the criminal act of intentional disruption of services
That’s not a crime, and again reveals gaps in your knowledge on this topic.
No, but it is a starting point for passing some kind of sanity check. Someone making $81k in 1990 was earning an exceedingly high salary relative to the general population, and computer-related professions weren’t exactly known for high salaries until maybe the 2000s.
[This report](https://www.bls.gov/ocs/publications/pdf/white-collar-pay-private-goods-producing-industries-march-1990.pdf) has government statistics showing that in March 1990, entry level programmers were making on average about $27k. Senior programmers were making about $34k. Systems analysts (which I understand to have primarily been mainframe programmers in 1990) were making low 30s at the entry level and high 60s at the most senior level. Going up the management track, only the fourth and highest level was making above $80k, and it seems to me that those are going to be high level executives.
So yeah, $81k was a very senior salary in the 1990s tech industry, probably significantly less common than today’s $200k tech jobs.
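Back-of-envelope, using approximate BLS CPI-U annual averages:

```python
# Rough inflation adjustment. CPI values are approximate BLS CPI-U annual
# averages, close enough for a sanity check.
CPI = {1990: 130.7, 2024: 313.7}

def adjust(amount, from_year, to_year):
    return amount * CPI[to_year] / CPI[from_year]

print(f"${adjust(81_000, 1990, 2024):,.0f}")  # roughly $194,000 in 2024 dollars
```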
You have to expect OP, who is well established in his field, to compare accordingly, not with the average pay of 1990.
I’m talking about a number that is 1.4x the 95th percentile generally. It’d be weird to assume that programmers were getting paid that much more than doctors and lawyers and bankers.
According to this survey series, the median IEEE member was making about $58k (which was also the average for 35-year-olds in the survey). Electrical engineering is a closely related discipline to programming.
So yeah, an $81k salary was really, really high in 1990. I suspect the original comment was thinking of the ’90s in general and chose a salary from later in the decade, then ran the inflation numbers back to 1990 with the wrong conversion factor.
Edited to add: this Bureau of Labor Statistics publication summarizes salaries by several professions and experience levels as of March 1990. The most senior programmers were making around $34k, the most senior systems analysts were making about $69k, and the most senior managers, who could fairly be described as executives, were making about $88k.
I’m gonna continue to use ad blockers and yt-dlp, and if you think I’m a criminal for doing so, I’m gonna say you don’t understand either technology or criminal law.
Who is making $165k out of college?
Computer science and engineering grads at the top of their class at top schools who choose not to go to grad school. This thread claims to cite Department of Education data showing median salaries 3 years after graduation, and some of them are higher than $165k. Sure, that’s 3 years out, but it’s also the median, so one would expect the 75th or 90th percentile numbers to be higher.
Anecdotally, I know people from Stanford/MIT who got their first jobs in the Bay Area at more than $150k over 10 years ago, so it was definitely possible.
But this NYT article has stories about graduates from Purdue, Oregon State, and Georgetown, which are good schools but generally weren’t producing many graduates landing those $150k jobs the way the very top tier was. I would assume the kids graduating from Caltech, MIT, Stanford, and UC Berkeley are still doing well. But the middle is getting left behind.
Were people getting paid $81k in 1990? This site shows that the 95th percentile in 1990 was $58k, and doesn’t have more granular data above the 95th percentile. So someone making $81k was definitely a 5 percenter, maybe even a 2 percenter.
The eyebrow raiser in the Slate’s base configuration is that it doesn’t come with any audio system: no radio antenna/tuner, no speakers. It remains to be seen how upgradeable the base configuration is for audio: how involved a task it will be to install speakers in the dash or doors, to add antennas (especially for AM, which is tricky because of interference from EV systems), etc.
I’d imagine that most people would choose to spend a few thousand on that audio upgrade just to meet the bare minimum expectations one would have for a new vehicle, so that cuts into the affordability of the package.