SimForums.com: Forum Home > General Discussion Forums > Hardware, Software, and Computer Technology
Topic: XMP
Author
Message
777simmer (Senior Member)
Joined: May-08-2012
Location: Vienna
Points: 1942
Topic: XMP
Posted: September-15-2018 at 8:09pm
I can't get a 4-stick Corsair set to run at 3333MHz.
4x16GB sticks, as approved by the QVL (except that our sticks are black and the QVL ones are red).

They run fine at 2133, but with XMP enabled the system will not boot and I need to clear the CMOS.

Now, this is on an AMD board, which I know no one uses here, so I am not asking for specific settings, but...

Isn't XMP supposed to be a fully automated overclock setting that should not require tweaking voltages or timings?

Another thing: the GTX 1080 Ti is not working in PCIe slot 1 (no display; not even the BIOS shows).
In PCIe 2 it works, so I am leaving it there for now, but I am wondering if PCIe 1 is broken. Is there any way to test this?

thx.


Rob

i7 3770k 4.5Ghz, Asus Max V Formula,GTX780 (previously 680), 8Gb GskillDDR3 2400@9-11-11-31, Change:Win7 64bit on WD Raptor and FSX on SSD since Jan2014.

Until Sept 2012: Core2Duo E8500@3.9GHz
NickN (Certified Professional)
Joined: November-21-2007
Points: 17272
Posted: September-16-2018 at 5:37am
There is not a lot I can help you with.
 
First, what do you mean the QVL sticks are red and yours are black? Color makes no difference, but you must make sure the part numbers on the QVL match yours exactly, with absolutely no deviation, and confirm that the number of sticks, speed, and timing also match the QVL for the motherboard/processor.
 
As you said, this is an AMD system. I have not used AMD since 2005. XMP is the profile stored on the memory modules so they register correctly with the motherboard. In the past, failures with XMP on Intel could come from improper DDR voltage. If the timing and speed are registered correctly but the voltage for that speed/timing is not, then the user must manually enter the BIOS and make sure the correct voltage for the speed/timing is in fact being set.
 
Past that, I would contact Corsair.
 
As for the video card not working in PCIe x16 slot 1... if this were an Intel board, I would suspect a very defective motherboard; however, there may be BIOS settings that need attention. Again, this is AMD and I cannot comment with any validity.
 
Take what I said above and look it over... unfortunately I cannot assist much further.
 
 
NickN (Certified Professional)
Joined: November-21-2007
Points: 17272
Posted: September-16-2018 at 6:44pm
Just a follow-up... XMP is not exactly an 'overclock', although the motherboard manufacturer can consider any DRAM speed over 'X' for the motherboard model an overclock (OC) of that system.
 
XMP simply makes it easier to set up the DRAM correctly. I remember, years back with Intel, sometimes the motherboard did not read the SPD chips on the memory modules correctly, or the memory manufacturer made an error programming the chip. As mentioned above, XMP would not correctly load the DRAM voltage, but in some cases it also did not correctly load the timings or sub-timings, which required getting the settings from the memory manufacturer and manually entering one or both in the BIOS.
 
Corsair and the motherboard manufacturer should be able to assist in that area.
 
It is also possible you simply got a flakey set of modules or a flakey motherboard.
 
You could try one stick at a time with XMP set, to see if it will run the correct speed/timing, and then try two. You will need to check the motherboard manual to see which slots are to be used for single and dual operation.
 
Unless there is another PCIe card in the system that is interfering with slot 1, or a setting in the BIOS that enables/disables the first x16 slot, as mentioned above I would suspect you may have a defective motherboard. I would look into that issue first, as it may also be why the memory isn't running XMP as it should per the QVL.
777simmer (Senior Member)
Joined: May-08-2012
Location: Vienna
Points: 1942
Posted: September-17-2018 at 5:04pm
Hi Nick, thanks for jumping in!

With red versus black I meant the following:
QVL-approved memory: Corsair Vengeance CMK64GX4M4B3333C16R
Our box says: Corsair Vengeance CMK64GX4M4B3333C16

When I google them, I see red sticks for the part number ending in "R".
The ones we bought (without the "R") are black.

At first I thought red or black would not matter. But now that I look at the box, it does say INTEL XMP CERTIFIED :-(
Maybe the red ones from the QVL are AMD XMP certified and I should have gotten them instead :-(

I will ask Corsair.

As for the PCIe problem, I will ask ASRock about that.
Even with nothing else plugged in that uses PCIe, PCIe 1 would not work.
So as you said, the MB is either faulty or it is a BIOS setting.

Rob

i7 3770k 4.5Ghz, Asus Max V Formula,GTX780 (previously 680), 8Gb GskillDDR3 2400@9-11-11-31, Change:Win7 64bit on WD Raptor and FSX on SSD since Jan2014.

Until Sept 2012: Core2Duo E8500@3.9GHz
NickN (Certified Professional)
Joined: November-21-2007
Points: 17272
Posted: September-17-2018 at 6:52pm
It should be the same memory; just the color is different.


It could be that the motherboard is not reading the XMP profile right and it all has to be manually entered in the BIOS, including voltage.

Use CPU-Z and see what the SPD tab timing table says for XMP-1666.5 (3333), and then check in the BIOS to confirm it is the same for each. The voltage is important too; I don't know what it should be for those sticks. The motherboard may be defaulting the voltage too low.


You may need to address the XMP dropdown in the BIOS and/or manually set up the DRAM. I came across this with respect to those sticks: https://rog.asus.com/forum/showthread.php?97452-Corsair-CMK64GX4M4B3333C16-BIOS-Memory-Settings
 
 
 
 
AnkH (Intermediate Group)
Joined: July-05-2013
Points: 84
Posted: September-19-2018 at 6:04am
Sorry if I jump in here, but there is a related question that has been bothering me since I switched to an 8700K build: what is more important for Prepar3D v4.x, the latency or the bandwidth? I guess both play a role, so let's ask more specifically: should I go for the highest possible bandwidth at the expense of latency, or accept lower bandwidth with very good latency? I have two options, both of which would replace my current 3200MHz CL16 RAM:

1. replace it with 4000 CL19 modules and profit from the increased bandwidth (+25%) and slightly better latency (-5%), or

2. replace it with 3200MHz CL14 modules and profit only from the almost 15% improved latency?

Do not ask why I bought this 3200MHz CL16 RAM in the beginning; it was simply the only one available at that time.

The above question is basically about the price tag. The 4000MHz CL19 modules are about 30% more expensive than the 3200MHz CL14 modules...
--------------------------------
Chris
NickN (Certified Professional)
Joined: November-21-2007
Points: 17272
Posted: September-19-2018 at 6:24pm
I will give you a fast and dirty answer.
 
Don't waste your money on DDR4 4000 CL19.
 
The change from DDR4 3200 CL16 to CL14 has more value for the cost. <-- buy this and skip the rest of my post below, but make sure to clock the system up too in order to really use the advantage.
 
 
 
 
 
====================================
 
 
 
The details...
 
Nothing has changed: lower latency still means better performance at the same speed.

For some reason people focus on DDR speed/CL and nothing else. Yes, those two are the primary values to look for (the higher the speed and the lower the latency, the better); however, if the other timing values are high, they can negate the advantage. Memory companies use that all the time.

So although we absolutely use DDR<speed> and CL as the primary way to judge, looking at the sub-timings is important too. Let's say you can have DDR4 3200 at CL14-18-18-37 or CL14-14-14-31. You don't want the higher sub-timing memory.

That being said, the higher the speed the better; however, if the timing bump is not reasonable at the higher speed, you are wasting your money. For DDR3 I would go by the rule of 1 tick up in CL for every 130MHz of real clock (DDR/2). Because of the bandwidth change I can give DDR4 a little more wiggle room, but not much.
 
In order to be successful at that you must know the fastest timing @ a defined speed and go up from there.
 
Example:
 
1/((DDR<speed>/2)/1000) x CL = ns
 
DDR3 1600 (2*800MHz) CL6-6-5-24    = 7.5ns
 
266.5MHz + bump -  2 ticks up on CL
 
DDR3 2133 (2*1066.5MHz) CL8-10-9-31  = 7.5ns
 
133.5MHz + bump - 1 tick up on CL, no more
 
DDR3 2400 (2*1200MHz) CL9-11-11-31  =7.5ns
 
 
So between DDR3 1600 (800MHz) and DDR3 2400 (1200MHz) I raised the CL choice 3 ticks, from 6 to 9, with a 400MHz bump. However, on the last bump I took slightly higher sub-timings to keep the CL lower. I have maintained 7.5ns with each choice, but my bandwidth keeps going up.
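The arithmetic above is easy to script. As a quick Python sketch (my own illustration, not something from the posts; the function name is made up), this reproduces the 7.5ns figures:

```python
def ddr_latency_ns(ddr_speed: int, cas: int) -> float:
    """True CAS latency in ns: CAS cycles divided by the real clock in GHz.

    ddr_speed is the DDR rating (e.g. 2400); the real clock is half of that,
    so this matches the 1/((DDR/2)/1000) x CL formula from the post.
    """
    real_clock_ghz = (ddr_speed / 2) / 1000
    return cas / real_clock_ghz

# The three DDR3 kits from the example all land on 7.5ns:
for speed, cas in [(1600, 6), (2133, 8), (2400, 9)]:
    print(f"DDR3 {speed} CL{cas}: {ddr_latency_ns(speed, cas):.1f}ns")
```

Same function covers the DDR4 cases discussed later in the thread (e.g. 4000 CL19 gives 9.5ns).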
 
What I can tell you is that all 3 of those choices above were not cheap memory for their time. They were considered top speed stable @ low timing, just like buying DDR4 4800 CL17 (if you could get it) today, except the memory manufacturers are screwing everyone on pricing today as compared to equally fast DDR3 in the past.
 
I probably paid an average of around 189-389 bucks for those 8-16GB kits over the years.
 
IMPORTANT: If you intend to clock, and clock HIGH, it is FAR BETTER to run 2 sticks of memory when clocking than filling up the banks. Unless there is a need for quad support I always opt for 2 sticks. That means I have a better chance of running a higher CPU speed with stable memory speed/timing. 16GB is more than enough for most systems; 32 is an option. If, after I obtain my desired high clock, it is proven stable, THEN I may opt to try adding memory.
 
Many times I purchase 4 sticks so I have spares of my good memory in case they are needed, and after I confirm I have a stable clock, I will test populating more slots with the same settings. The good memory never seems to last on the market very long, so I get it while I can.
 
 
 
 
Getting into DDR4.....  Dead Thumbs Down
 
I personally won't buy a DDR4 system (I already built and tested one, then sold it, thank god) and I will demo why below. Here are the memory choices you outlined:
 
DDR4 4000 CL19
1/((4000/2)/1000) x 19 = 9.5ns
 
DDR4 3200 CL14
1/((3200/2)/1000) x 14 = 8.75ns
 
3200 CL14 is faster in timing, and you are adding 5 steps UP in CL to obtain an additional 400MHz of real clock (3200/2 = 1600MHz to 4000/2 = 2000MHz).
 
So here is where the 4000 CL19 sits on the chart as compared to the 3200 CL14
 
Higher up the chart is better
 
 
 
 
Real world = in some situations you may see some advantage due to the cycle time and bandwidth, and in others a disadvantage. It would depend on the application, but overall you just got ripped off.
 
400MHz / 130MHz = 3.07 ticks up from DDR4 3200 CL14, which lands at DDR4 4000 CL17

So if I were interested in running something worth the change for flight sim, I would opt for DDR4 4000 CL17 (and CL16 if I could get it!)
 
DDR4 4000 (PC4 32000)
• Timing 17-17-17-37
• CAS Latency 17
• Voltage 1.35V
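NickN's "one CL tick per 130MHz" rule can be sketched the same way (again my own helper, using his 130MHz figure; the function name is made up):

```python
def max_cl_for_upgrade(base_speed: int, base_cl: int, new_speed: int,
                       mhz_per_tick: int = 130) -> int:
    """Highest CL worth paying for at new_speed, per the rule of thumb:
    allow one CL tick for every 130MHz of real-clock (DDR/2) increase."""
    clock_gain = (new_speed - base_speed) / 2  # real clock rises by half the DDR delta
    return base_cl + int(clock_gain / mhz_per_tick)

# DDR4 3200 CL14 -> DDR4 4000: 400MHz real-clock gain / 130 = ~3 ticks -> CL17
print(max_cl_for_upgrade(3200, 14, 4000))
# The DDR3 example works out the same way: 1600 CL6 -> 2400 gives CL9
print(max_cl_for_upgrade(1600, 6, 2400))
```

This matches both the DDR3 1600 CL6 to 2400 CL9 progression earlier in the post and the DDR4 4000 CL17 recommendation here.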
 
And here is that memory on the chart with the others....
 
 
 
Of course if you don't want to pay to play, the 3200 CL14's are fine. https://www.newegg.com/Product/Product.aspx?Item=N82E16820313712
 
CL14-14-14-31, and that's not a bad price for 2x8GB (16GB)
 
 
The difference between DDR4 3200 CL16 and CL14 has far more value than going to 4000 @ 19. 3200 @ 16 sits at #134 on that chart, compared to #90 for CL14.
 
Because you are not pushing your motherboard to run 4000, you will have higher stable clocking potential, so all the way around the 3200 CL14 is the best choice, unless you want to play with the big boys Big smile
 
 
There is NO SUCH THING AS VALUE AT THE TOP END OF HIGH PERFORMANCE - you will never get top-end performance in memory unless you are willing to play/pay.
 
Unlike most, I don't mind doing some testing of modules to find the best ones not QVL listed... (such as buying DDR4 4600 CLx and down-clocking it for far better timing). But most do not want to deal with their motherboard not running the sticks and sending product back. That is why, if you are not into testing and tweaking, it is best to go by the motherboard memory QVL and not deviate from the list.
 
Your MOTHERBOARD quality will define whether it will run higher speed / low timing memory... most times boards that cost $250+ are required for 4000+ speeds, but not always.
 
 
Now, I have built a DDR4 setup and tested it: 4.8GHz @ DDR4 3200 CL14. I found no discernible difference between it and my i7 4770K @ 4.8GHz on DDR3 2400 CL9. As a matter of fact I found it doggier, and it barely threw out benchmarks (that make a difference) much higher than my 5-year-old system.
 
Lower is better
 DDR3 2400 CL9
1/((2400/2)/1000) x 9 = 7.5ns
 
DDR4 Equivalent:
In order to hit 7.5ns on the DDR4 chart I would have to run DDR4 4000 CL15 (it doesn't exist); however, given the difference in bandwidth I can opt to go up in timing. At a difference of 800MHz of real clock between 2400 (1200MHz) and 4000 (2000MHz), I would run 4000 at no more than CL17. That is how I would cross my existing DDR3 2400 CL9 over to DDR4 for equal or 'somewhat' better performance. That's how the industry is ripping the market off. Anyone who is running a DDR3 2400 CL9 system @ 4.5-4.8GHz is wasting their money buying a new system, and being locked into Windows 10 on top of it.
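That crossover can also be computed directly: for a target true latency, the highest usable CL at a given DDR speed follows from rearranging the latency formula (a sketch under the same assumptions as above; helper name is mine):

```python
import math

def cl_for_target_ns(ddr_speed: int, target_ns: float) -> int:
    """Largest CAS value that still meets target_ns at ddr_speed.

    target_ns >= CL / ((DDR/2)/1000)  =>  CL <= target_ns * (DDR/2) / 1000
    """
    return math.floor(target_ns * (ddr_speed / 2) / 1000)

# To equal DDR3 2400 CL9 (7.5ns) at DDR4 4000 you would need CL15,
# which, as noted above, does not exist as a retail kit.
print(cl_for_target_ns(4000, 7.5))
```

The extra two ticks up to CL17 are the "bandwidth credit" NickN allows for the 800MHz real-clock gain.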
 
 
From Haswell to today, 4.8GHz is 4.8GHz; it doesn't matter if it's a 4770K or an 8700K, unless one needs updated Intel instructions for professional A/V editing or engineering programs... and that software hasn't completely caught up to the tech, and flight sim (any version) doesn't care about or use such features. I run both W7 and W10, and there is no difference between the two in how well a sim runs.
 
If and when they actually make the next real jump in tech and I am still alive, I will visit it again.
 
 
 
And to bring this to a close: the overall, real target you are after is memory speed/timing and a CPU/cache speed that generates close to, or LESS THAN, 40ns in the test shown below.

Anything below 40 is smokin' hot fast (First Line, Last Readout - LATENCY)

If my memory READ-WRITE-COPY is, say, 50000-65000MB/s but my LATENCY is in the 65-75ns range, then that awesome bandwidth has to wait nearly double the time to be processed.
 
Where's the beef??
 
 
That is my current real world performance and has been for over 5 years
 
4.75GHz @ 4444 CPU cache (47x101BCLK) running DDR3 2424 CL9-11-11-31
 
DDR3 2424 CL9
1/((2424/2)/1000) x 9 = 7.4ns
 
 
 
All this being said... you are not going to see major differences in real-world performance running stock CPU speeds, standard cache speeds, and standard BIOS settings.

Making use of the memory speed and timing goes beyond plugging it in and booting the computer with XMP set up.
 
Keep that in mind.
 
 
 
 
 
 
 
NickN (Certified Professional)
Joined: November-21-2007
Points: 17272
Posted: September-19-2018 at 7:44pm
...and just a reminder to everyone reading this: just because a memory product has a high-speed, low-timing spec does not mean your motherboard will run it.

Some people roll the dice and try higher-speed product; others stick to the QVL list, and in both cases I have seen FAILS. You have a better shot at running stable with what is on the QVL list.

It also comes down to your clocking. You may not be able to run 5GHz @ DDR4 4800.
 
 
 
That being said....
 
 
===============================================
 

DDR4-4800MHz CL17 2x8GB – The Ultimate Combination of Frequency & Timing

Another ground breaking demo memory kit at the G.SKILL booth is the DDR4-4800MHz CL17-17-17-37 2x8GB Trident Z RGB memory kit that’s based on Samsung B-die, which showcases the lowest possible CAS latency timing at such high frequency. This extreme memory kit is demoed with the Intel® Core™ i7-8700K processor and ASUS ROG MAXIMUS X APEX motherboard.

================================================
 
That is not a cheap motherboard, and although I am sure it could be done on a less costly board, it can't be a budget board.
 
BUT WAIT!  Here is the exciting news....  Look at what they finally achieved!!!
 
DDR4 4800 CL17
1/((4800/2)/1000) x 17 = 7.1ns
 
Heck, that beats my real-world DDR3 2400 by 0.3ns Confused ...where do I sign up?

(although I WILL admit a 1200MHz boost in bandwidth at my desired timing mark IS VERY TEMPTING, LOL)
 
I'm not paying the extortion fee for it!
 
Big smile
 
=================================================
 
DDR4-4600MHz 4x8GB – The Fastest 32GB Memory Kit

In addition to the high-speed 16GB (2x8GB) kits, G.SKILL also is showing the fastest 32GB (4x8GB) memory kit ever seen at DDR4-4600MHz, while maintaining a low CAS latency at CL18-18-18-38. This extreme performance kit is also built with Samsung 8Gb ICs, and is shown alongside the Intel® Core™ i5-8600K processor and ASUS ROG MAXIMUS X HERO motherboard.

=================================================
 
Far more reasonable price for the board.
 
DDR4 4600 CL18
1/((4600/2)/1000) x 18 = 7.83ns
 
 
 
I would take this if the sticks were not ridiculously priced, and if I could actually get a CPU up to and over 5GHz at a decent CPU cache speed without a $300-500 motherboard to run that memory speed/timing.
 
=======================================================
 
 
 
The part they don't tell you about all this stuff: you must have the hardware to run it, and to take advantage of it, you must be clocking that CPU/cache.

If you can't do that, you won't see the value of your overpriced purchases!
 
 
 
 
 
 
We all lucked out back in 2013 with the 4770-4790Ks, running 4.7-4.8GHz with G.Skill DDR3 2400 CL9. All we had to do was update the video card and we get the same performance as these 8700s on top end (*cough*-choke-puke) DDR4. LOL
 
 
It may be nice to squeeze another 300-400MHz out of a processor more easily, but I am not paying the entrance fee and being forced to run Windows 10 to use it.
 
 
...oh, by the way, I am not the only one who sees this DDR4 memory crap as just that
 
QUOTE: DRAM shortages (*cough* price-fixing *cough*) have caused memory module prices to get out of hand. Some of the higher-clocked memory kits can be twice or even thrice as expensive as those running at reference or JEDEC clock speeds. 
 
 
 
 
 
 
AnkH (Intermediate Group)
Joined: July-05-2013
Points: 84
Posted: September-20-2018 at 3:37am
Ok, I could have gone with the short answer, but your longer one was a more interesting read; thanks for that :-)

BTW: I currently use my 2x16GB sticks of 3200MHz CL16 RAM on an ASUS Maximus X Hero mainboard; my 8700K is delidded and clocked to 5.0GHz. This overclock I would like to keep, so the best thing for me is to opt for 3200MHz CL14 2x16GB sticks where the sub-timings are also good. Finding OC RAM is always a little bit tricky here in Switzerland, but I guess the available 2x16GB kit from G.Skill, F4-3200C14D-32GTZKW with 14-14-14-34, would be fine, no?
--------------------------------
Chris
NickN (Certified Professional)
Joined: November-21-2007
Points: 17272
Posted: September-20-2018 at 3:44am
I will give you a fast and dirty answer.
 
Don't waste your money on DDR4 4000 CL19.
 
The change from DDR4 3200 CL16 to CL14 has more value for the cost. <-- buy this and skip the rest of my post below but make sure to clock the system up too in order to really use the advantage.
 
 
===================================================
 
 
 
The reason I posted the details is that the last time I posted a fast and dirty answer on a highly technical subject, I got nothing but unprofessional, opinionated responses. So I just wanted to make sure the fast and dirty response had the engineering to back it up, in order to circumvent 5-10 long posts of argumentative amateur nonsense from forum lurkers.
 
 
 
NickN (Certified Professional)
Joined: November-21-2007
Points: 17272
Posted: September-20-2018 at 4:11am
Your goal is to get as close to 40ns <or less> on the AIDA Cache and Memory Benchmark.
 
You don't need the payware version to see that result.
 
It's funny that they block the parts that don't really mean squat in the trial version. Although it is good to see the MB/s to be sure of other factors, all you should be concerned with is the latency in that test readout.

Get it as close to 40ns or under on a stable clock @ 4.8 or above, and you have accomplished the goal.
 
 
 
 
 
First post... that's how you do it.

Post #719 has that beat, but as you can see, the values I gave you are the gold standard. You will get the best result on the 4000 @ CL17, but you have to pay.


Scroll down and try not to laugh at some of those results (forget AMD; it doesn't read right in AIDA).
AnkH (Intermediate Group)
Joined: July-05-2013
Points: 84
Posted: September-20-2018 at 5:09am
Got it, thanks. I will first run this AIDA benchmark on my current rig to see how it performs, and then I can estimate where to go with the numbers and the info in your explanation above. Thanks!
--------------------------------
Chris
NickN (Certified Professional)
Joined: November-21-2007
Points: 17272
Posted: September-20-2018 at 5:17am
Some of those guys are drag racing (post 719 for one), meaning that although they can set up using timing and voltages that are not standard and get the result, they can't run 24/7 stable like that.

That is why I outline buying the right speed/timing for the sticks and having the right board to run them.

The magic number has always been 7.5ns or lower at the highest speed possible, stable. That always seems to land the latency number right in the 40 or less range.
 
good luck~! Beer
 
 
 
 
Ted striker (Senior Member)
Joined: January-24-2006
Location: Denver
Points: 141
Posted: September-20-2018 at 7:38pm
Thanks for the detailed analysis of the current memory/CAS situation. I was wondering where things currently sat with this. I am still running my 3770K at 4.5GHz with the 2200MHz CAS 9 memory that you helped me set up back in 2013.

I replaced the 780 GTX with a 1080 Ti to power my UHD monitor. Now I have no performance penalty between HD and UHD, but the old CPU/motherboard is definitely the bottleneck now. So I'm back on the mousewheel hoping the 9900K will be the savior LOL.

Ted


3770K @4.5Ghz, Noctua NH-C12P, Asus Z77-V Deluxe, Corsair 2133-9-11-10-27-2T, 780 GTX, Win7-64 on 256gb Intel SSD, FSX, P3Dv3, P3Dv4 on 500gb Intel SSD, PC Power & Cooling 750W, Antec P193 case
NickN (Certified Professional)
Joined: November-21-2007
Points: 17272
Posted: September-21-2018 at 12:05am
Yeah, unfortunately Ted, you are running an IvyBridge, which means that even though it was tuned up as best as possible here in 2013, you didn't get the advantages Haswell offers, nor full PCIe3 support. So if you don't mind Winders 10 (an unthinkably horrible OS downgrade), a hardware upgrade is probably in your future.


I might be a bit daft, but I think I recall you did hit, or came very close to, the AIDA 40ns memory latency mark back then.

Simplified:

1/((2200/2)/1000) x 9 = 8.18ns... a touch high (but better than DDR4 4000 CL17 on timing), and it works!
 
Use the info above..  I made it very easy to follow and figure out... but be ready to bust'ya wallet to get it done right. Big smile
 
 
 
BTW, on a personal note...  Did you ever get that drinking problem under control?  https://youtu.be/pl4plPGRG8o?t=5    LOL
 
 
Thumbs Up
funknnasty (New Member)
Joined: April-17-2017
Points: 14
Posted: September-21-2018 at 10:19am

G.Skill 3200 CL14 on Win7, all 6 cores running at 5.0 with no AVX offset, and not delidded - uh huh

Oh, and the now-ubiquitous P3D 4.3 settings I run would bring any Haswell to its knees..... :-)

C'mon Nick, get with the times, man :-)
Fly happy (Senior Member)
Joined: October-10-2012
Location: Sweden
Points: 859
Posted: September-21-2018 at 10:43am
That shows what the Apex is made of
Hans

W7/64 Ultimate, FSX Gold, SB-E 3930K @ 4.7, Sabertooth X79, GTX580, 4x2GB G.Skill RipjawsZ @ 2133-9-11-10-28-1T, Corsair H110+Obsidian 900D,Seasonic P-1000, GEX, UTX, UT2, REX, S-Tech LC, NGX.
Ted striker (Senior Member)
Joined: January-24-2006
Location: Denver
Points: 141
Posted: September-21-2018 at 1:06pm
You guys got my curiosity up, so I dug up my old AIDA64 runs. I am using 2 x 4GB Corsair 2133 9-11-10-27 and was able to push it to 2400 9-11-10-30 1T for a read bandwidth of 36,250 MB/s and a latency of 36.9ns. It's still running like a champ, but my bandwidth is close to half that of funknnasty's. What has kept me from upgrading is having to switch to W10. As I recall, funknnasty is also running W7 on his system, so I'm hoping I will be able to pull off the same with the 9900K.

Regarding that drinking problem: that is a feature. When I reach my limit I am not capable of consuming any more. Embarrassed

Ted
3770K @4.5Ghz, Noctua NH-C12P, Asus Z77-V Deluxe, Corsair 2133-9-11-10-27-2T, 780 GTX, Win7-64 on 256gb Intel SSD, FSX, P3Dv3, P3Dv4 on 500gb Intel SSD, PC Power & Cooling 750W, Antec P193 case
777simmer (Senior Member)
Joined: May-08-2012
Location: Vienna
Points: 1942
Posted: September-21-2018 at 1:23pm
Great explanations above Nick!

Thx
Rob

i7 3770k 4.5Ghz, Asus Max V Formula,GTX780 (previously 680), 8Gb GskillDDR3 2400@9-11-11-31, Change:Win7 64bit on WD Raptor and FSX on SSD since Jan2014.

Until Sept 2012: Core2Duo E8500@3.9GHz
NickN (Certified Professional)
Joined: November-21-2007
Points: 17272
Posted: September-21-2018 at 3:52pm
Originally posted by funknnasty funknnasty wrote:


gskill 3200 cl14  on win7  all 6 cores running at 5.0 with no avx offset and not delidded -uh huh

Oh, and the now ubiquitous P3d 4.3 settings I run would bring any Haswell to its knees ..... :-)

C'mon Nick get with the times, man :-)
 
 
Hey,  Censored     did I say it couldn't be done???   Clown 
 
Originally posted by NickN NickN wrote:

 
Unlike most, I don't mind doing some testing of modules to find the best ones...
 
 
 
Screw the times.
 
 
Of course, referencing the image above... that's how you do it... but it's still 8.185ns, and look at your L3 latency... SHAME ON YOU!
 
(Nick is shamelessly ignoring the other differences for comic relief)
 
 
 Clown  QUIT SLACKING AND GET WITH THE PROGRAM! Beer 
 
 
 
It is obvious you know what the score is and how to achieve it. Big smile
 
LOL
 
Most people would never attempt, or know how, to up-clock (overclock) memory and properly test it for stability.
 
 
 
 
NickN (Certified Professional)
Joined: November-21-2007
Points: 17272
Posted: September-21-2018 at 3:55pm
Originally posted by Ted striker Ted striker wrote:

You guys got my curiosity up so I dug up my old Aida64 runs. I am using 2 x 4gb Corsair 2133 9-11-10-27 and was able to push it to 2400 9-11-10-30 1T for a read bandwith of 36,250 MB/s and a latency of 36.9.  It's still running like a champ but my bandwidth is close to half that of funknnasty. What has kept me from upgrading is having to switch to W10. As I recall funknnasty is also running W7 on his system so I'm hoping I will be able to pull off the same with the 9900k.

Regarding that drinking problem, that is a feature. When I reach my limit I am not capable of consuming anymore. Embarrassed

Ted
 
 
Yes I clearly recall you followed my outline and got that memory running right around or below 40ns.
 
 
Yes, it is possible to run W7 on platforms after Skylake, but you do need to make sure any/all drivers needed for the board itself, as well as for all devices/cards, are available, or know how to run a hacked setup to get them installed.
 
 
If you don't know how, or become frustrated, it could lead right back to relapse, so please be careful
 
 
NickN (Certified Professional)
Joined: November-21-2007
Points: 17272
Posted: September-21-2018 at 4:20pm
Originally posted by Fly happy Fly happy wrote:

That shows what the Apex is made of
 
Yes it does, currently over 500 USD on Amazon
Ted striker (Senior Member)
Joined: January-24-2006
Location: Denver
Points: 141
Posted: September-21-2018 at 5:02pm
Originally posted by NickN NickN wrote:

 
If you don't know how, or become frustrated, it could lead right back to relapse, so please be careful


The worst downside is that if I fail, I will have to buy and use W10. My sim computer is only used for flightsim, so the Microsoft data spies will not get much, except perhaps thinking that maybe they should have kept developing the ESP franchise.

Originally posted by NickN NickN wrote:

 
Yes it does, currently over 500 USD on Amazon


Yikes, it might cost me more than I thought to stay with W7 Shocked

Ted
3770K @4.5Ghz, Noctua NH-C12P, Asus Z77-V Deluxe, Corsair 2133-9-11-10-27-2T, 780 GTX, Win7-64 on 256gb Intel SSD, FSX, P3Dv3, P3Dv4 on 500gb Intel SSD, PC Power & Cooling 750W, Antec P193 case
funknnasty (New Member)
Joined: April-17-2017
Points: 14
Posted: September-21-2018 at 9:08pm
 
 

 

 

 
 
Originally posted by NickN NickN wrote:

Screw the times.
 
 
Of course referencing the image above... that's how you do it...  but its still 8.185ns  and look at your L3 latency...    SHAME ON YOU! 
 
(Nick is shamelessly ignoring the other differences for comic relief)
 
 
 Clown  QUIT SLACKING AND GET WITH THE PROGRAM! Beer 
 
 
 
Oh stop it.  I'm just trying to break up the Haswell love fest over here. The orgy has to stop.
You know (or not), there's been lots of discussion concerning blurries, autogen, stutters and FPS issues over at the AVSIM forums lately... and I'll be darned if no less than 90% of the posters of concern don't have Haswells. Somebody has to step up and speak out against this insanity. It's my civic duty. ...In the meantime I'll go searching for that last half of a nanosecond. :-)
funknnasty (New Member)
Joined: April-17-2017
Points: 14
Posted: September-21-2018 at 9:30pm
Originally posted by Ted striker Ted striker wrote:

Originally posted by NickN NickN wrote:

 
If you don't know how, or become frustrated, it could lead right back to relapse, so please be careful


The worst downside is that if I fail, I will have to buy and use W10. My sim computer is only used for flightsim, so the Microsoft data spies will not get much, except perhaps thinking that maybe they should have kept developing the ESP franchise.

Originally posted by NickN NickN wrote:

 
Yes it does, currently over 500 USD on Amazon


Yikes, it might cost me more than I thought to stay with W7 Shocked

Ted


I paid $335 USD for mine. 

Now if you had an Apex with a PS/2 keyboard and mouse you could, in very simple and basic terms, pull the Win7 drive out of your Haswell and just plug it into your Coffee Lake ...oh, it will work. As far as installing fresh, the PS/2 ports come in handy ...Win7 will recognize them during install ...heck, the mobo even has a USB 2.0 header if needed. Side note on PS/2 ports: I believe using the PS/2 ports helps minimize mouse input lag while in the cockpit of your favorite ride.

I really don't think you'll have any trouble with the Win7 install. But if trouble comes, the other cool feature that comes with the Apex is the dual BIOS ....making it ultra easy for Win10 and 7 to coexist on the same puter.


Back to Top
777simmer View Drop Down
Senior Member
Senior Member


Joined: May-08-2012
Location: Vienna
Points: 1942
Direct Link To This Post Posted: September-22-2018 at 4:46am
If I may go back to my original problem for a second... I got a reply from Asrock support. 
And I am thankful for that, but at the same time confused about the PCIe part. 

I pasted their reply below, but basically they suggest that the GPU could be broken in such a way that it works in a PCIe x8 slot but will not work in a PCIe x16 slot.

This sounds strange to me. A GPU is either broken or it works. Can it really work only on half the lanes?

Furthermore, they claim that only PCIE1 and PCIE4 on our MB are connected with 16 lanes.
But the PDF for our MB reads:
4 x PCI Express 3.0 x16 Slots (PCIE1/PCIE2/PCIE4/PCIE5: single at x16 (PCIE1); dual at x16 (PCIE1) / x16 (PCIE4); triple at x16 (PCIE1) / x8 (PCIE2) / x16 (PCIE4); quad at x16 (PCIE1) / x8 (PCIE2) / x16 (PCIE4) / x8 (PCIE5))*

So,
4 x PCI Express 3.0 x16 Slots
That would mean each of the 4 slots has 16 lanes (as long as you use a single GPU), correct?
(If only 2 slots have 16 lanes and the other 2 slots have only 8 lanes, shouldn't the specs say 2x PCIe x16 and 2x PCIe x8?)
So it seems to me that with no other hardware on the MB (except a C drive) the GPU should work with the full 16-lane bandwidth in any PCIe slot, right?





Asrock reply:

Hello,

 

Graphics card in PCIE1

 

The problem with the graphics card in PCIE1 is not a known issue. It could be related to the graphics card, motherboard / CPU socket, or the CPU.

During testing please do not install other PCIe cards.

Disconnect all storage devices (M.2, SATA, USB). Without any storage device connected the system should boot into BIOS setup automatically, and once in BIOS you will normally see Ab on the debug display.

Please use default BIOS settings (load via F9 in BIOS).

With every test please give the system a minute or 2 to see if it boots.

 

1. PCIE1 and PCIE4 are connected to the CPU via 16 lanes each.

PCIE2 and PCIE5 are connected to the CPU via 8 lanes each.

 

Please test with the GTX 1080Ti in PCIE4. If you get no screen output there either, then it looks like there is a problem with the graphics card. Please try to test it in another system in the main PCIe slot (with 16 lanes available and no other PCIe cards present in that system). Or try another graphics card in slot PCIE1 and try it in PCIE4 as well.

 

2. If the system works fine with the graphics card in PCIE4 then it appears to be motherboard or CPU related.

Then please carefully open the CPU socket. In clear light, under different angles, check the socket for possible bent or broken pins. Look for anything that does not match the pattern of the pins. Also make sure there is no thermal paste on any socket pin or CPU contact.

 

If you find any damage to the socket then please let me know.

 

3. If the problem is not related to the graphics card itself or to a damaged socket, then it should be either a faulty CPU or motherboard. I can imagine you do not have another Threadripper CPU to test with, so in that case I recommend checking with the seller if they can test CPU and motherboard in the same way for you, and try another CPU/motherboard.

 

XMP profile

 

DRAM frequencies above DDR4-2667 are in the OC range. Results may vary and depend on the combination of DRAM, CPU, motherboard, AGESA code and BIOS. To find the limit for this system with 4 modules please:

- Install the modules in slots A2/B2/C2/D2.

- Load default BIOS settings via F9.

- Load the XMP profile.

- Then set the DRAM Frequency to DDR4-2667.

- Test the stability (or simply check if it boots first).

- If the system is stable then change the DRAM frequency to DDR4-2933. Test again.

- If the system is stable then change the DRAM frequency to DDR4-3200. Test again.

 

If you want you can repeat the test with only 2 modules, in A2 and B2. Maybe you can reach a higher frequency with 2 modules. But I assume you want to stick to 4 modules.
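Their step-up procedure is effectively a ladder search over supported speeds. A sketch of the logic (the `is_stable` callback is hypothetical, standing in for the manual boot-and-stress test done after each BIOS change):

```python
def find_max_stable_speed(speeds, is_stable):
    """Walk up the frequency ladder and return the last speed that passed.

    `speeds` is tried in ascending order; `is_stable` stands in for the
    manual boot + stability test done after each BIOS change.
    """
    best = None
    for mhz in speeds:
        if not is_stable(mhz):
            break  # stop at the first failure; anything higher won't pass either
        best = mhz
    return best

# Hypothetical example: suppose the 4-stick config passes at 2933 but not 3200.
ladder = [2667, 2933, 3200]
print(find_max_stable_speed(ladder, lambda mhz: mhz <= 2933))  # 2933
```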

 

Rob

i7 3770k 4.5Ghz, Asus Max V Formula,GTX780 (previously 680), 8Gb GskillDDR3 2400@9-11-11-31, Change:Win7 64bit on WD Raptor and FSX on SSD since Jan2014.

Untill Sept 2012: Core2Duo E8500@3.9Ghz
Back to Top
NickN View Drop Down
Certified Professional
Certified Professional


Joined: November-21-2007
Points: 17272
Direct Link To This Post Posted: September-22-2018 at 8:15pm
Originally posted by 777simmer 777simmer wrote:

 
I pasted their reply below, but basically they suggest that the GPU could be broken in such a way that it works in a PCIe x8 slot but will not work in a PCIe x16 slot.
 
This sounds strange to me. A GPU is either broken or it works. Can it really work only on half the lanes?
Furthermore, they claim that only PCIE1 and PCIE4 on our MB are connected with 16 lanes.
But the PDF for our MB reads:
 
4 x PCI Express 3.0 x16 Slots (PCIE1/PCIE2/PCIE4/PCIE5: single at x16 (PCIE1); dual at x16 (PCIE1) / x16 (PCIE4); triple at x16 (PCIE1) / x8 (PCIE2) / x16 (PCIE4); quad at x16 (PCIE1) / x8 (PCIE2) / x16 (PCIE4) / x8 (PCIE5))*
So,
4 x PCI Express 3.0 x16 Slots
That would mean each of the 4 slots has 16 lanes (as long as you use a single GPU) correct?
(if only 2 slots have 16 lanes and the other 2 slots have only 8 lanes should the specs then not say 2x PCIEx16  and 2x PCIEx8?)
 
So it seems to me that with no other hardware on the MB (except a C drive) the GPU should work with full bandwidth 16 lanes in any PCIE slot right?

 

 
 
You are not reading the PDF correctly. A x16 SLOT is the physical size and design of the slot. It means a x16 PCIe card will plug into it, but that does not mean it runs at x16, or even PCIe 3.0.
 
 
So;
4 x PCI Express 3.0 x16 Slots - This means there are 4 physical slots that fit x16 cards and will run PCIe 3.0.
 
 
single at x16 (PCIE1); this is @ x16
 
dual at x16 (PCIE1) / x16 (PCIE4); these 2 run @ x16
 
triple at x16 (PCIE1) / x8 (PCIE2) / x16 (PCIE4); this means PCIe2 will run x8
 
quad at x16 (PCIE1) / x8 (PCIE2) / x16 (PCIE4) / x8 (PCIE5))* this means PCIe2 and PCIe5 will run x8, and I suspect from the first statement that PCIe5 is either not PCIe 3.0 or not a physical x16 (it's a smaller slot), unless "4 x PCI Express 3.0 x16 Slots" is a typo
 
 
So the only slots that run x16 3.0 are #1 and #4. That is why they asked you to test slot 4 with your video card.
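That spec line can be read as a lookup table: the electrical width of each slot depends only on how many cards are installed. A quick sketch of the quoted allocation (slot names and widths taken from the PDF excerpt above; an illustration, not ASRock documentation):

```python
# Electrical lane allocation per the quoted ASRock spec line.
# All four slots are *physically* x16, but not all run x16 electrically.
LANE_TABLE = {
    1: {"PCIE1": 16},                                       # single card
    2: {"PCIE1": 16, "PCIE4": 16},                          # dual
    3: {"PCIE1": 16, "PCIE2": 8, "PCIE4": 16},              # triple
    4: {"PCIE1": 16, "PCIE2": 8, "PCIE4": 16, "PCIE5": 8},  # quad
}

def lanes_for(slot: str, cards_installed: int) -> int:
    """Lanes a populated slot runs at, given the total number of cards."""
    return LANE_TABLE[cards_installed].get(slot, 0)

print(lanes_for("PCIE1", 1))  # 16 -> the slot meant for a single GPU
print(lanes_for("PCIE2", 3))  # 8  -> PCIE2 is electrically x8 in a triple setup
```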
 
The outline they gave you to test and check is correct.
 
 
Back to Top
NickN View Drop Down
Certified Professional
Certified Professional


Joined: November-21-2007
Points: 17272
Direct Link To This Post Posted: September-22-2018 at 8:24pm
Originally posted by funknnasty funknnasty wrote:

 

You know (or not) there's been lots of discussion concerning blurries, autogen, stutters and FPS issues over at the AVSIM forums lately   ...and I'll be darned if 90% of the posters of concern don't have Haswells.
 
 
I hate to bust your bubble, but I have been watching over there and the issue isn't the hardware. It's the application itself, with (possibly) other poor choices.
 
4.2GHz Haswell is to P3D/FSX as 4.2GHz Coffee Lake is to P3D/FSX
 
There can be defined differences in how the application responds to the memory choice and setup, as well as the choice and setup of the video adapter. That has been true with FSX, FSX:SE and P3D, always has been: the speed at which scenery loads and sharpens up as the scene is approached by the aircraft, as well as smooth frame transitions.
 
But not to the point where the entire scene goes blurry and stays that way. Or becomes a stutter-fest.
 
 
In the past that was a combination of the speed of the entire system, including the storage, and usually included poor consideration for running programs and OS operations (services) that either were not needed or were hogging resources. Then there is the user pouring addons on top of it all, or hammering the video card with SGSS or other hacks and shader changes.
 
A lot of people just assumed they could install Windows, throw in their security programs and other software without even looking at what was being booted/loaded, and, since their hardware fell into the flightsim list of required hardware, take off and run the sim just fine.
 
Well, nothing has changed, and to be quite frank: the additions that have gone into P3D, which everyone is slamming on top of addon developers who are slamming in more of those same features, compound the issues on a render engine and platform that was never designed to render such things efficiently. It is possible to overrun the most powerful graphics cards made today, not because the sim is advanced, but because it is NOT being generated in an advanced render engine technology.
 
 
My advice to those having massive stutter or blurry issues is, regardless of the hardware, get rid of the addons, get rid of the shader hack programs, get rid of everything other than default P3D then reset the video setting profiles back to default other than set PREFER MAXIMUM PERFORMANCE for the power management, get rid of any external application regardless of use, and retest.
 
Over the same area these stutters or blurries existed, do they still exist? Y/N
 
IF YES; you may want to take the step of reinstalling the sim clean first, and also make sure there are no security scans from Windows or 3rd party security programs going on. Exclude all sim directories from Windows and 3rd party security programs. Problems still? ...take it over to the P3D support forum and outline the issue.
 
 
IF NO; start adding things back in one by one, testing carefully in between, and don't slam SGSS-AA settings into the mix. That gets added back last, and only if absolutely needed. If it is needed, or the user just won't do without it, then something else has to come down to allow it.
 
And this 4K binge going on?  Pros keep it at 1080p and some move up to 2K, but never higher.
 
 
Switching from Haswell to Coffee Lake isn't going to change what they are seeing, unless of course in that switch they decide to throw down for the better SSD, the better video card and DDR4000 CL17 memory with the motherboard and PSU to run it, and don't make the same mistakes again reloading addons and using the sim.
 
I would never suggest to anyone that if they throw several thousand dollars at P3Dv4 with a CPU and memory platform change it will fix their problems, because that is simply not true.
 
Everyone over there is a beta tester for LM. The sooner that is clearly understood by all and what that means, the better.
 
 
 
I don't have stutter and blur issues and I am not running top dollar 5GHz DDR4-4200 @ 4K.
 
 
 
 
 
Back to Top
NickN View Drop Down
Certified Professional
Certified Professional


Joined: November-21-2007
Points: 17272
Direct Link To This Post Posted: September-22-2018 at 9:50pm
Adding to what I posted above about this....
 
 
 
 
Originally posted by 777simmer 777simmer wrote:

 

Please test with the GTX 1080Ti in PCIE4. If there you get no screen output either then it looks like there is a problem with the graphics card. Please try to test it in another system in the main PCIe slot (with 16 lanes available and no other PCIe cards present in that system. Or try another graphics card in slot PCIE1 and try it in PCIE4 as well.

 

2. If the system works fine with the graphics card in PCIE4 then it appears to be motherboard or CPU related.

Then please carefully open the CPU socket. In clear light under different angles check the socket for possible bent or broken pins. Looks for anything that does not match the pattern of the pins. Also make sure there is no thermal paste on any socket pin or CPU contact.

 

If you find any damage to the socket then please let me know.

 

3. If the problem is not related to the graphics card itself or to a damaged socket, then it should be either a faulty CPU or motherboard. I can imagine you do not have another Threadripper CPU to test with, so in that case I recommend checking with the seller if they can test CPU and motherboard in the same way for you, and try another CPU/motherboard.

 

 

 
 
Yes, a single video card can run x8 in a slot defined to run @ x16, and this can be a hardware or driver issue. However, I have never seen a BLACK SCREEN where the card doesn't work at all unless the hardware (motherboard) is not working.
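For scale, here is a rough sketch of what x8 vs x16 means in raw PCIe 3.0 bandwidth (theoretical peak; in practice a single GPU often loses little at x8):

```python
# PCIe 3.0 runs 8 GT/s per lane with 128b/130b encoding,
# so each lane carries roughly 0.985 GB/s of payload.
def pcie3_bandwidth_gbs(lanes: int) -> float:
    """Theoretical peak PCIe 3.0 payload bandwidth in GB/s."""
    return lanes * 8 * (128 / 130) / 8  # GT/s * encoding efficiency / 8 bits per byte

print(round(pcie3_bandwidth_gbs(16), 2))  # 15.75 GB/s at x16
print(round(pcie3_bandwidth_gbs(8), 2))   # 7.88 GB/s at x8
```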
 
As they said, if you plug into PCIe4 and get a screen and verify it is running x16 then most likely PCIe1 slot is physically defective or the circuits to it are defective.
 
If you get no screen or a BLACK screen, as you do in Slot1, then there is still the possibility of a defective board, but they are saying a CPU socket with bent pins can cause it too. I HAVE HEARD of bent CPU pins causing memory sticks to not show up. It was a common issue in the past: folks who built systems either received a board that had bent pins, or bent them installing the CPU.
 
 
I have never heard of a PCIe slot(s) not working on CPU pin problems but that does not mean it isn't possible.
 
They are also asking that you swap the card into another system (Slot1) and see if it works. My guess is it will but you need to confirm that.
 
 
Everything they outlined to you looks correct for checking all this, although in my opinion, if that card checks out OK in another system, I would just send the board back for a replacement and not monkey around with it.
 
If, as they suggest, the seller is willing to test the CPU/motherboard, replace what is defective and send it back... that is another option.
 
 
 
 
 
 
Back to Top
777simmer View Drop Down
Senior Member
Senior Member


Joined: May-08-2012
Location: Vienna
Points: 1942
Direct Link To This Post Posted: September-22-2018 at 11:04pm
Ok I understand.

I feel kinda stupid that I did not know a PCIEx16 slot referred to the physical size only, not 16 electrical lanes.
(In my defense, I have never needed that MB info because I have never worked with more than one GPU, so I have always needed slot 1 only and never worried how many lanes the other GPU slots had)

We shall test as instructed :-)

Thanks!
Rob

i7 3770k 4.5Ghz, Asus Max V Formula,GTX780 (previously 680), 8Gb GskillDDR3 2400@9-11-11-31, Change:Win7 64bit on WD Raptor and FSX on SSD since Jan2014.

Untill Sept 2012: Core2Duo E8500@3.9Ghz
Back to Top
NickN View Drop Down
Certified Professional
Certified Professional


Joined: November-21-2007
Points: 17272
Direct Link To This Post Posted: September-22-2018 at 11:13pm
Yea, that can be confusing, so don't be too hard on yourself. Just remember it is a form factor slot design first, meaning the size and mechanicals of the slot and the number of contacts. 
Then from there, what it will and won't do is based on the motherboard and the CPU.
 
There was a time when folks thought x16+x16 = 2x x16, but instead learned that when 2 x16 cards were in the slots, they both defaulted back to x8 in Windows. They just didn't read the motherboard manual right. Manuals usually provide a chart for that.
 
You can check what your card is doing using GPUz. It shows the BUS INTERFACE readout, and remember you must load the card with a test to get it to display its full render output... just click the render test button (question mark next to the readout, '?'), a small box appears, run the test (not full screen) and look at the output in BUS INTERFACE.
 
It will show you what your card is running when in render load mode.
 
 
Back to Top
777simmer View Drop Down
Senior Member
Senior Member


Joined: May-08-2012
Location: Vienna
Points: 1942
Direct Link To This Post Posted: September-23-2018 at 3:20am
Ah yes, GPUz and CPUz, that was another thing I wanted to ask!

Do those apps read correctly on an AMD system?

(Now that I have inherited my brother's old PC (Haswell 4770k) I was going through your Haswell overclocking guide. I have checked on AIDA64 and they have updated it to work better with AMD. But I don't know yet about OCCT, CPUz and GPUz).

edit: I did some digging and it seems that CPUz and GPUz should work with AMD motherboards. AIDA64 should work as well for the most part. OCCT I still have to investigate. 
Rob

i7 3770k 4.5Ghz, Asus Max V Formula,GTX780 (previously 680), 8Gb GskillDDR3 2400@9-11-11-31, Change:Win7 64bit on WD Raptor and FSX on SSD since Jan2014.

Untill Sept 2012: Core2Duo E8500@3.9Ghz
Back to Top
funknnasty View Drop Down
New Member
New Member


Joined: April-17-2017
Points: 14
Direct Link To This Post Posted: September-23-2018 at 11:40am

 
I hate to bust your bubble, but I have been watching over there and the issue isn't the hardware. It's the application itself, with (possibly) other poor choices.

I think we would both agree you're the smartest man in the room. Not unlike the man from Oz with his smoke and mirror show -with regards to this thread.
 
4.2GHz Haswell is to P3D/FSX as 4.2GHz Coffee Lake is to P3D/FSX

Really? Just a coincidence that when I post a P3D 4.3 performance video the AVSIM forum is flooded with blurries, stutters and autogen complaints? Virtually all complaints are coming from 4 core users, with most being Haswell users. <----- do these users complain because they've been told by many that Haswell performance is equal to that of a six core Coffee Lake? 
 
There can be defined differences in how the application responds in the memory choice and setup of it as well as choice and setup of the video adapter. That has been true with FSX, FSXSE and P3D, always has been. The speed at which scenery loads and sharpens up as it loads and as the scene is approached by the aircraft as well as smooth frame transitions.
 
But not to the point where the entire scene goes blurry and stays that way. Or becomes a stutter-fest.
 
Why don't we tell the people of Oz, in simple flipping terms, that the bandwidth of a 5.0 GHz Coffee Lake running 4000 C17 memory is double that of a Haswell?  You know, like: while your DDR3 can pull an order and get it to the shipping docks a fraction of a nanosecond faster than my Coffee Lake, my Coffee Lake will get nearly two orders to the customer's receiving docks in less time than that single order with DDR3 at 2400 CL9.
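As a back-of-envelope check on that claim (theoretical peak only; real throughput and the effect of CPU/cache clocks are a different matter), peak DRAM bandwidth is transfer rate × 8 bytes per 64-bit channel × number of channels:

```python
def peak_bandwidth_gbs(mt_per_s: int, channels: int = 2, bus_bytes: int = 8) -> float:
    """Theoretical peak DRAM bandwidth in GB/s (64-bit channel = 8 bytes/transfer)."""
    return mt_per_s * bus_bytes * channels / 1000

print(peak_bandwidth_gbs(2400))  # 38.4 -> dual-channel DDR3-2400 (Haswell)
print(peak_bandwidth_gbs(4000))  # 64.0 -> dual-channel DDR4-4000 (Coffee Lake)
```

The raw DDR ratio alone is about 1.67x, not double; the rest of the claimed gap would have to come from the higher core and cache clocks.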
 
 
Well, nothing has changed and to be quite frank to run the additions that have gone into P3D and that everyone is slamming on top of the addon developers who are slamming more of those features and compound the issues, on a render engine and platform that was never designed to render such things efficiently. In that it is possible to overrun the most powerful graphics cards made today, not because it is advanced but because it is NOT being generated in an advanced render engine technology.
 
I disagree ...well, to a point. Yes, there is some legacy stuff that will limit performance, but I can assure you that one will not appreciate the power of P3D 4.3 with a Haswell ...or any 4 core setup for that matter. 
 
My advice to those having massive stutter or blurry issues is, regardless of the hardware, get rid of the addons, get rid of the shader hack programs, get rid of everything other than default P3D then reset the video setting profiles back to default other than set PREFER MAXIMUM PERFORMANCE for the power management, get rid of any external application regardless of use, and retest.

Or they could bring the sim settings in line with the capacity of the hardware they use, no?
 
 
I would never suggest to anyone that if they throw several thousand dollars at P3Dv4 with a CPU and memory platform change it will fix their problems, because that is simply not true.

Agree. New instruments won't help a bad conductor.

  
I don't have stutter and blur issues and I am not running top dollar 5GHz DDR4-4200 @ 4K.

One of the perks of being the smartest man in the room? :-)


Look, <stealing from an old beer commercial> you could have a good time drinking Haswell but why take the chance.
 
 
 
 
 
Back to Top
777simmer View Drop Down
Senior Member
Senior Member


Joined: May-08-2012
Location: Vienna
Points: 1942
Direct Link To This Post Posted: September-23-2018 at 1:18pm
@funknnasty

I have yet to see a thread on AVSIM that does not end in “I know better than you” :-(
Kinda like what is flaming up here right now... unless I am misinterpreting your intentions.
The exceptions are those threads where a real world specialist (for example a pilot) jumps into the discussion. Then people shut up and stop making things up. Probably just out of self preservation, but whatever, as long as it works...
I don't know what the trait is called that drives people with no clue to fabricate stories just to seem knowledgeable, but I can't stand it, and so I looked elsewhere for advice.

Over here there is much less of that “I know better than you” going on. 
And the reason for that is, just like I mentioned above, that there is someone with true knowledge guiding the discussions.

As far as hardware discussion goes, that someone is what AVSIM lacks, and thus the community there does not follow a single guide that results in a high performance PC with a blurry-free and stutter-free FSX experience.
Instead everyone is doing their own thing and everyone knows better.
And then they wonder why it does not work, and the “told you so” and “you are doing it all wrong” starts going again.

I am sure you know what you are doing, and the result is that you have a smooth P3D experience.
I can also imagine that a new PC (yours) probably has the potential to do better than an older (Haswell) platform. But without some guidance on how to properly combine this newer hardware, and guidance on how to create a high performance Win10 system, many of us will never see the better performance.
So rather than blaming a certain platform, I would blame the setup of the user.

Just like I did not have a great FSX experience until I started following the tips and tricks from here (and only here), I can imagine that many AVSIM members with blurries and stutters would benefit greatly from following a single guide from someone who knows what they are doing.
Maybe you can be that guy over at AVSIM, hey?

Anyway, rather than having a “who knows better” discussion, it would be nice if we can keep this thread going on memory, memory BIOS settings, how to select memory in the future and stuff like that. 

thanks


Rob

i7 3770k 4.5Ghz, Asus Max V Formula,GTX780 (previously 680), 8Gb GskillDDR3 2400@9-11-11-31, Change:Win7 64bit on WD Raptor and FSX on SSD since Jan2014.

Untill Sept 2012: Core2Duo E8500@3.9Ghz
Back to Top
NickN View Drop Down
Certified Professional
Certified Professional


Joined: November-21-2007
Points: 17272
Direct Link To This Post Posted: September-23-2018 at 3:11pm
No worries Rob. I think it's funny, to tell you the truth, and I am sure he is just bustin' my chops. LOLLOLLOLLOL
 
I noticed my original speech from years ago on memory speed/timing = latency has been put into practice. Can't miss that. Of course, perhaps that wasn't a result of my teachings and it was figured out independently. More power to the user!
 
 
Where did I ever say a 4 core was better than a 6? As a matter of fact, going back to the days before P3D (980X days), I said quite clearly that the application can make better use of a 6 core proc..   I believe I mentioned that again in the bible thread circa 2013?
 
Glad my suggestion was followed.
 
==============================================
 
A quad core should be the minimum number for cores used for FSX.


In the future if 6 core processors become a norm for the non-extreme class offering, then go for the 6 over the quad. Although 8 core non-extreme class solutions could also become a norm in the future, anything over 8 cores, based on how much data FSX is threading really is a 'total' waste of money.

 
=============================================
 
 
... past that, and even today with P3D4-whatever, more than 6 is a waste. You might squeeze a bit more with an 8, but not worth it even with the latest P3D.
 
And even with that, I am NOT the smartest guy in the room! 
 
LOLBeer
 
 
but, I also don't have stutters and blurries on a 4 core. Big smile
 
 
I am not about to spend *another* 2-3000 bucks to run an application that is stuck in ESP render engine world and quite frankly is heading fast forward to...   VR
 
 
 
  Yes Alex, I would like to take my next paycheck and buy a clue!
 
--- And there is.....   THE DAILY DOUBLE!!
 
Name the application people have purchased and continue to purchase over and over again, some at a ridiculous cost, who have also spent more than enough over the years on the hardware to run it and the addons that they could have acquired a real pilot license, possibly financed the ownership of their own aircraft, or flown around the 'real' world making regular stops as a first class passenger.....
 
 Clown
 
 
Back to Top
AnkH View Drop Down
Intermediate Group
Intermediate Group


Joined: July-05-2013
Points: 84
Direct Link To This Post Posted: September-24-2018 at 12:31pm
Ok, that's how it looks on my rig:


Seems that those CL16 modules are pretty far away from the 40ns sweet spot, no? Will CL14 modules really get me to 40ns?

Then, I wonder if I lose any sim performance (P3Dv4.3) due to the fact that my northbridge clock is rather low compared to the one posted above?

Thanks for the help anyway,

Regards
Chris
--------------------------------
Chris
Back to Top
NickN View Drop Down
Certified Professional
Certified Professional


Joined: November-21-2007
Points: 17272
Direct Link To This Post Posted: September-24-2018 at 4:21pm
Originally posted by AnkH AnkH wrote:


 
Seems that those CL16 modules are pretty far away from the 40ns sweetspot, no? CL14 modules will really provide me 40ns?
 
I never said the 3200 CL14 modules would provide 40ns or less, I said that is the target to try to obtain. Going from CL16 to CL14 with proper sub-timings will drop the latency and those sticks will run far better, but I can't estimate the overall latency change because it is not just the CL; it's the speed of the memory too, along with the CPU and cache clocks.
 
Look at your current memory... it's not CL16-16-16-34, it's CL16-18-18-38. With your current memory you are at a further disadvantage, as the sub-timing is 2 clicks higher. You should obtain CL14-14-14-34 and not sacrifice sub-timing. For 32GB that's this: https://www.newegg.com/Product/Product.aspx?Item=N82E16820232206
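As a rough way to compare timings across speeds: the CAS portion of access latency in nanoseconds is CL divided by the memory clock (half the DDR transfer rate). A quick sketch:

```python
def cas_latency_ns(cl: int, ddr_rate_mt_s: float) -> float:
    """CAS latency in ns: CL cycles / memory clock (MHz) = CL * 2000 / DDR rate."""
    return cl * 2000 / ddr_rate_mt_s

print(round(cas_latency_ns(16, 3200), 3))  # 10.0  (the current CL16 sticks)
print(round(cas_latency_ns(14, 3200), 3))  # 8.75  (CL14 at the same speed)
print(round(cas_latency_ns(17, 4154), 3))  # 8.185 (the 4154 CL17 overclock above)
```

This is only the CAS component; the ~40ns figures discussed in this thread come from AIDA's full memory latency measurement, which adds the controller and cache path on top.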
 
 
I did say that if it were me I would go for the DDR4 4000 @ CL17 which would put me close to 40ns without any need to clock the memory or push it that hard. But my motherboard must also allow that memory speed -and- my CPU clock to run stable. 
 
 
 
FunkynNasty shows that with 4154 CL17 he is under 40ns @ 5GHz and a 4700 cache speed. I would say 4000 CL17-17-17 would default to right around 40, but again it all comes down to the entire system and setup.
 
FunkynNasty did not buy 4000 memory. According to him and his test readout, he did it by overclocking the GSkill 3200 CL14 memory sticks to 4154 CL17 (post above: https://www.simforums.com/forums/xmp_topic60552_post385203.html#385203) and he also clocked his northbridge up. Although the NB is not the primary reason for the <40ns latency, getting it somewhat in line with the CPU speed is another bump in performance.
 
I don't know what he set in the way of voltage or how hot his sticks are running. Going from 3200 to 4154 is quite a jump from their default and something not typically seen in memory up-clocking. I never attempted it myself so I do not know the level of risk involved with his memory overclock.
 
He is also running an expensive high-end motherboard. Such boards are physically designed to be used for clocking, which can make a difference in being able to obtain such a high memory overclock stable. They are also typically made to run top-end memory speed, where other motherboards may not be able to run DDR4 4000 even if it is the default of the sticks and not clocked.
 
It was nice to read about Funk-n-Nasty's setup while he tooted his horn. What he didn't do was clearly acknowledge that what he achieved was partly because he spent top dollar on the motherboard. He did not specify where he got the memory overclock settings he used, whether he was exceeding the manufacturer's voltage specs to do it, or whether what he showed in AIDA is a 24/7 setup that is safe to run. If it is, he should reference where the clocking information came from so it can be properly analyzed for safe operation.
 
The quality of the memory sticks and chips also determines the ability to up-clock. Cheaper memory may or may not use the same PCB and chip design as the name-brand kits; it varies from kit to kit.
 
 
 
 
None of this is going to create a 'miracle' Prepar3D change, but it assures the highest resource performance available to the application. From there it comes down to the storage device speed and the video card; after that, the user settings for P3D, including AA levels and screen resolution; and last, what addons and their features are doing to render performance and what must be sacrificed to run all of the above.
 
 
 
 
Back to Top
NickN
Certified Professional
Joined: November-21-2007
Points: 17272
Posted: September-25-2018 at 2:17am
Originally posted by 777simmer:


 
Now that I have inherited my brother's old PC (Haswell 4770K), I was going through your Haswell overclocking guide.
 
I cannot comment on AMD and software other than to say AMD uses a different method of memory access and handles memory latency differently. Therefore what you see in the AIDA memory and cache benchmark is not the same as on an Intel setup and cannot be compared.
 
I would suggest delidding the 4770K. You get a lot more out of it that way.
Back to Top
AnkH
Intermediate Group
Joined: July-05-2013
Points: 84
Posted: September-25-2018 at 4:17am
Originally posted by NickN:

Look at your current memory.. its not CL16-16-16-34 its CL16-18-18-38 with your current memory you are at further disadvantage as the sub-timing is 2 clicks higher. You should obtain CL14-14-14-34 and do not sacrifice sub timing. For 32GB that's this; https://www.newegg.com/Product/Product.aspx?Item=N82E16820232206
 
Those are the modules I ordered, although they were only available in white here in Switzerland. As soon as I have them (by the coming weekend at the latest), I will do another AIDA64 run and see if the numbers are better.
--------------------------------
Chris
Back to Top