New on LowEndTalk? Please Register and read our Community Rules.
All new Registrations are manually reviewed and approved, so a short delay after registration may occur before your account becomes active.
Intel 13th gen is now the new king (performance / value)?
Been seeing various news sources publishing benchmark results for the new Intel 13th gen CPUs.
TL;DR: The i9-13900K outperforms the Ryzen 9 7950X.
There is a video here from LTT:
What are your thoughts?
Providers: Are you going to make these part of your product line offerings (VMs / Dedicated)?
Let's have an interesting discussion here.
Comments
i9-13900K Boost Clock 5.8 GHz out of the box.
Can reach 8.2 GHz overclocked with liquid nitrogen.
This CPU is amazing
I still prefer AMD though.
Just wait for the next Spectre-like speculative execution vulnerability & mitigation to knock Intel down another 10-30% in performance over time! I too would prefer AMD.
@eva2000 If I recall correctly, Ryzen does have a Spectre vulnerability too. Just did a quick, admittedly biased search, and it seems it was confirmed by AMD. Not sure if the fix affected performance.
DDR5 is still expensive.
If you are buying DDR5 you might as well buy ECC @Shakib .
Last I checked, recent-gen Intel i9s support ECC.
If it ain't ECC, it's not production ready. I think this is something that has been said forever; it used to be, and still is, the norm. Not sure what's happening with providers suddenly claiming that ECC's advantages are a myth.
On the consumer side, I'm now seeing a few DDR5 kits hovering around the same price I purchased my DDR4 for. Of course, that's higher-end DDR4 vs. lower-end DDR5, but nice to see. 13th gen still supports DDR4, unlike AMD's latest, so that's another edge for Intel.
Maybe in a few more years we see people upgrading their breakers to keep up with these power demands.
@Hxxx , The idea of ECC memory is that it will protect your system from a potential crash by correcting errors in the data. You will definitely need ECC RAM on older systems with older HDD/SSD models.
There isn't any error for our memory to correct because we are using newer-gen Samsung NVMe drives that have integrated LDPC ECC, which protects our data integrity by correcting errors if they occur.
On average we get approx. 250 to 300 days of uptime from each node before we do a kernel update and reboot.
But, sure. I will prioritize ECC during my next upgrade just for enthusiasts like you.
@jamz , the i9-11900K GB5 single-core score is 1904 on an empty node.
The i9-13900K does 2212. That's only 308 points higher.
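For context, the gap between those two scores (the poster's own measurements, not official figures) works out as follows:

```python
# Relative uplift from the Geekbench 5 single-core scores quoted above
# (1904 for the i9-11900K, 2212 for the i9-13900K).
old_score = 1904
new_score = 2212

delta = new_score - old_score      # absolute difference in points
uplift = delta / old_score * 100   # percentage improvement over the 11900K

print(f"{delta} points, {uplift:.1f}% faster")  # 308 points, 16.2% faster
```

So "only 308 higher" is roughly a 16% single-core gain over two generations, on these particular nodes.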
There is no point in upgrading to 13th gen without DDR5 RAM.
My current plan is to get new DDR5 systems in 2024, when things settle down, so I can migrate my existing customers to the new platform without charging anything extra. I might consider getting 13th gen CPUs next year if pricing drops significantly.
My opinions are based on facts and cost cutting measures. Correct me if I am wrong.
There is ECC in the storage drives and then there is the ECC in the RAM modules.
If you are running non-ECC RAM then you wouldn't know if your data was corrupted. Remember, a bitflip doesn't necessarily crash your system. Maybe it flips a digit in an important transaction before it gets passed to storage to be written. Once corrupted in RAM, the storage's ECC will not correct what happened at the RAM module level. Especially if it is a new transaction that didn't exist before.
But thanks for bringing that up; there is indeed ECC in storage drives. Ideally a system should have ECC in both places.
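To make the bitflip-before-write point concrete, here's a tiny Python sketch (the buffer contents and the flipped bit are made up for illustration): the storage layer faithfully checksums and stores whatever it is handed, so its ECC happily "protects" data that was already corrupted in RAM.

```python
import hashlib

# Hypothetical illustration: a single bit flips in RAM *before* the data
# is handed to the storage layer.
transaction = bytearray(b"transfer $1000 to account 12345")

# Simulate a bitflip in the in-memory buffer (bit 1 of byte 10: '1' -> '3').
transaction[10] ^= 0x02

# The storage layer checksums and stores what it was given.
stored = bytes(transaction)
checksum = hashlib.sha256(stored).hexdigest()

# On read-back the storage-level check verifies fine -- the corruption
# happened upstream, in RAM, so drive ECC cannot detect it.
assert hashlib.sha256(stored).hexdigest() == checksum
print(stored.decode())  # transfer $3000 to account 12345
```

The drive did its job perfectly; the transaction is still wrong. Only ECC at the RAM level could have caught this.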
@Shakib Yes thanks for considering ECC for the future.
However, the DDR5 standard chips come with on-die ECC. This is not the same as mainline ECC: it just prevents bitflips within the chip itself, and as soon as the data moves off the chip it is unprotected. I was watching a very well explained video about this misinterpretation. Very interesting. In summary, it is not the same ECC.
@Shakib in case it interests you.
Intel didn't start supporting ECC on desktop CPUs for nothing.
i think.
Just wanted to add: sure, ECC won't matter to everybody. If you are hosting cat gifs or doing something unimportant with the server, it probably wouldn't make a difference to you (the user).
However, if you are deploying VMs to crunch data, do data science, run databases, or do e-commerce, then in my opinion you should be running it on a server with ECC.
Yes, some, but AMD has generally been less affected, with the exceptions of Retbleed and Zen 1. In fact, AMD Ryzen 7000 (Zen 4) is faster on Linux with the kernel mitigations enabled: https://www.phoronix.com/news/AMD-Zen-4-Mitigations-Off
and https://www.phoronix.com/review/amd-zen4-spectrev2
Track-record-wise, AMD is a safer bet for long-term performance. And ECC memory support would be another reason to side with AMD.
Oh, Phoronix has i5-13600K and i9-13900K teething issues: https://www.phoronix.com/review/intel-raptorlake-linux
Have you seen the power draw on these? They are crazy!
I have been burned by the lack of ECC memory in a previous laptop computer. I had inexpensive RAM with a rarely seen, intermittent defect - flipping a bit.
I first noticed the issue over a span of many months several years after I had purchased and installed the RAM. I was doing huge weekly downloads of backups to my home from a remote site. There were multiple files, and a typical file size ranged between 60 and 80 Gbytes. Some were larger. A typical download would run for over an hour at 100 mbits/sec. I hashed the files to ensure the integrity of the downloads. Once in a while the hashes did not match and I would have to repeat a download.
I started troubleshooting the internet connection first, but eventually figured out that it was defective RAM in my laptop. Most of the time, it ran well, but it flipped bits on rare occasions. Over the long period the RAM was installed, files on the local drive were corrupted with a bit error here and there, but it was so rare that I did not notice. My estimate was 1 bit error per 500 Gbytes from the downloads, possibly as low as 1 per Tbyte.
Obviously the bad RAM also corrupted files on the laptop itself as they were written. The corrupted files are random and scattered across the hard drive. They are essentially impossible to detect by scanning. Fortunately, they are rare, but I have to live with them. True, a bit error in a .jpg file doesn't matter.
I replaced the RAM and the large file downloads became ultra-reliable. After that experience, I would prefer ECC RAM and would pay a reasonable premium for it. Emphasis on reasonable - RAM defects in personal computers are extremely rare.
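The hash-and-verify loop described above can be sketched like this (SHA-256 is my assumption; the poster doesn't say which hash they used, and the function names are made up for illustration):

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Hash a large file in 1 MiB chunks so a 60-80 GB backup
    never has to fit in memory at once."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

def verify_download(path: str, expected: str) -> bool:
    """Compare against the hash published by the remote site.
    A mismatch means the transfer (or the local RAM) flipped a bit
    somewhere, and the download should be repeated."""
    return sha256_of(path) == expected
```

Note that this only tells you *that* something corrupted the file, not *where*; it took the poster a long time to rule out the network and blame the RAM.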
This was not my first rodeo with RAM defects. The first RAM defect I saw was a "heisenbug" in a custom computer - the RAM did not quite live up to its spec. I do not remember whether we adjusted the refresh timing or got RAM that matched spec to fix that bug.
(For the record, I know how to properly handle circuit cards and chips to avoid damaging them, and I follow the appropriate procedures even at home.)
350W from the CPU alone. lmao. Couple that with the new RTX and you're nearing 1 kW.
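Back-of-the-envelope, using the 350 W figure above plus ballpark numbers I'm assuming for the rest (450 W is the RTX 4090's rated board power; the "rest of system" figure is a rough guess):

```python
# Rough full-system power sketch under combined CPU + GPU load.
cpu_w = 350   # i9-13900K under heavy load, as quoted above
gpu_w = 450   # RTX 4090 rated board power
rest_w = 150  # motherboard, RAM, drives, fans, PSU losses (rough guess)

total = cpu_w + gpu_w + rest_w
print(f"~{total} W")  # ~950 W
```

So yes, with transient spikes on top of that, a 1 kW+ PSU stops being overkill.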
I checked a few: Hardware Unboxed, Moore's Law Is Dead. I did not bother with LTT or Gamers Nexus since even the titles seem to have a bias towards Intel. LTT even picked way faster (more expensive) RAM for the 13900K (6800 vs. 6000 for AMD, from the screencap I saw; pretty sure AMD could run 6800 too...).
imho, the 13900K is DOA.
You will only get the single thread performance benefits by going totally batshit bonkers on everything, and even then it's not enough to keep it from thermal throttling. Will be interesting to see sub-ambient cooling, but it looks like nothing else will keep it from throttling.
It costs a hell of a lot more for that single task (single thread) once you account for the extra cooling required, vs. AMD's options where you can PICK which balance you need and a regular air cooler will be sufficient to max it out, at a fraction of the power consumption.
It all adds up; to get the max out of the 13900K you need to spend extra on all of this:
e.g., if you go 13900K + DDR4 and gaming is #1, then the 5800X3D will match it
Then we are going to get 3D V-Cache (X3D) versions of Zen4 in Q1 as well.
I don't see many scenarios where the 13900K is the better choice.
For 17-20 seconds with the BEST AIO on the market, in a good case.
Not even the slightest chance unless the customer buys their own hardware OR pays way, way, way extra, like a 3-month HW ROI for us.
Intel is making good CPUs these days (thanks to AMD). But in my world with a limited budget, brand is not important. I specify my budget, and I have never found myself having to decide between two choices.