Autocad Civil 3d Line Types In R
Back to Basics: Autocad Linetypes. SQOTD: K.M is doing a plat that has an electrical linetype with an “E” in the line. He was hoping to get the “E” to repeat more often and to make the text larger so that it can be seen when plotted. How can we do this? Answer: To create a new linetype, the first step is to go.
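For a rough idea of where that answer is headed: custom linetypes are defined in a .lin file (acad.lin or a user file loaded through the LINETYPE command), and a text-embedded definition looks something like the two-line sketch below. The linetype name, dash lengths, and text style here are illustrative assumptions, not K.M's actual plat standard.

*E_LINE,Electrical ---- E ---- E ----
A,.5,-.25,["E",STANDARD,S=.2,R=0.0,X=-.1,Y=-.1],-.25

The first line is the name and a plain-text preview; the second is the pattern: a .5-unit dash, a .25 gap, the "E" drawn in the STANDARD text style at scale S=.2, then another gap to clear the text. Shortening the .5 dash makes the "E" repeat more often, and raising S= (with slightly wider gaps so the letter does not overlap the line) makes it larger. Remember that LTSCALE, and PSLTSCALE in paper space, multiply everything, so test-plot before committing the plat.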
GeForce >What’s Best for an AutoCad / Solidworks / Sketchup / Adobe CS Workstation? >Round 11,349 1.5.13 Mates, As I moved from 2D to 3D CAD in 2010, buying an 18-month-old Dell Precision T5400 for $500 [Xeon X5460 quad core @ 3.16GHz, 4GB RAM (upgraded to 12GB), Quadro FX 580, 1TB Barracuda, Vista Business 64-bit (upgraded to Windows 7 Ultimate 64)] seemed a near no-miss choice (this computer cost over $8,000 new), but I was unprepared for the complex decision of which graphics card would be best suited to the applications I use >AutoCad 2007, Sketchup (now 8 Pro), SolidWorks 2010 x64, Corel Technical Designer X-5, and Adobe CS4. Quadros were and are almost universally praised for their 2D CAD capabilities, and Autodesk and Solidworks have provided specialized drivers to optimize performance of their software using Quadros. However, it occurred to me that the lesser 3D performance of Quadros as compared to GeForce GTX should be considered, as I work, then and now, more in Sketchup and other 3D applications- and with large files, 80 to 120MB. I am learning Revit, a 3D program with big files and a lot of rendering power needed. The more I learn about rendering, the more I see the need for a very high performance computer- CPU, GPU, memory, and disk all have to be great. Because graphics performance is so essential to fluent use of the applications I use, it seemed to me that one of the best ways to choose a graphics card is to visit the sites of the applications you intend to use and look into their recommendations for the most demanding version of their applications. Also, nvidia, which makes the chips and drivers for both GeForce and Quadro, offers drivers that are “partnered” for specific use- you can get a specific Solidworks 2010 x64 driver, for example. Autodesk, and I think ArchiCad, do this as well, with Autodesk having tested cards with subsequent “recommendation” and “certification”. When I used the T5400 in AutoCad 2007 with a GeForce GTX 285, I had a periodic message, “This computer has non-certified hardware” or a similar error message, no doubt referring to the GTX card.
The Autodesk application I think is the most demanding is the Product Design Suite Ultimate, a vast package which includes AutoCad, Mechanical, Inventor Pro with simulation, 3ds Max, Mudbox, Electrical, and much more. I’ve read that Mudbox is quite demanding, though Autodesk only mentions the need for OpenGL support for that application.
Maya is another resource-heavy program- lots of rendering, lots of polygons. You need 60GB of HD space and something over $10,000. For 3D modeling the minimum system is, amazingly, a Pentium 4 @ 3GHz and 4GB RAM, or 8GB for large assemblies.
The recommended cards include ATI Firepros, but mainly Quadros. I was interested to see that Autodesk still recommends the old Quadro FX 580, the 512MB card I now use in my old Dell Optiplex 740 [AMD X2 6000+, 3GHz, 6GB, WD 750GB], a card which I see on eBay for as little as $30. The Quadro FX x800 cards are all there >1800, 3800, 4800, and 5800, as well as the current 400, 600, 2000, and 4000, but no Geforce. In less demanding applications such as AutoCad 2013- still having a lot of 3D capability- the older-series Quadros such as the FX 380, 570 ($25 on eBay), 580 ($35), 1700, 3700, 3800, etc. are “certified” or “recommended”. Geforce 200-series GTX 260, 280, 285, and 295 are listed, but not in that class. Useful guides to graphics cards may be found on Wikipedia under “Quadro” and “Comparison of Nvidia GeForce”, listing the specifications of those lines of graphics cards.
I was struck in the Quadro list by the specs of the FX 5800- 512-bit bus, high memory bandwidth of 159GB/s, 240 shaders, high clock rates, 4GB, and so on, all for only $2,700 or whatever- the 5800 was the top of the Quadros until the 6GB FX 6000, at over $3,000. However, there is a note on this listing that the GPU and specification, except memory (the GTX 285 is 1GB instead of 4), is shared with the Geforce GTX 285, and as I could buy a lightly-used GTX 285 on eBay for $140, that too seemed a no-miss choice. I also then believed that it was possible to soft-mod the GTX 285 into a Quadro 5800, but of course learned later that that trick was by then no longer allowed by nvidia. After installing Windows 7 Ultimate 64-bit and my applications, I installed the GTX 285 in the T5400 and put the Quadro FX 570 in my previous computer, a Dell Dimension 8400 of 2004 (Pentium 4 630 @ 3.0GHz, 3GB, ATI Radeon 9400, 750GB Seagate, XP Pro 64-bit), which has one of the first 64-bit CPUs, the hyperthreading Prescott single core, which reads in Device Manager as two cores. Then I discovered Passmark Performance Test- and surprise and disappointment. The T5400 with the GTX 285 did well, a rating of 1852- quite good- but the 2D score was only 300, coupled with a very good 3D score of 2208. Strangely, the Dimension 8400- single core, with 1/3 the RAM- had an overall mark of 452, but its 2D was 444!
In 2D, the 512MB Quadro FX 580 on a single-core CPU was outperforming, by nearly 50%, the 1GB GTX 285 on a quad-core Xeon computer that cost about $9,000 new! Given the relatively low 3D score of the FX 570, I learned then that Quadros of that era were indeed 2D specialists, but the much lower 2D score of the GTX on the T5400 was a mystery.
To make a long story- 25 hours!- of frustration short, I eventually learned that the Windows 7 Classic and Aero themes I’d tried were killers of 2D performance, at least on this T5400. Turning to the nasty baby-blue Windows 7 Basic theme, the 2D score of the T5400 jumped from 300 to 583!- near enough to doubling- and the 3D improved from 2208 to 2320. The overall rating of the T5400 also improved from 1852 to 2339.
I have never read of anyone else reporting the severe performance penalty of the Win 7 Classic and Aero themes, but there we are. This event made me wonder about all the other discreet performance hogs lurking among the “helpful” fuzzy-bear background programs, and I’ve become an obsessive Task Manager watcher (right-click on the Task Bar) to see what the CPU and memory are up to at any given moment. As I’ve used the T5400 over the last couple of years, I began to be dissatisfied with its performance in Sketchup, which I was using more and more with ever larger files. As the models became large, each time I changed the viewpoint the wait to regen was frustrating.
I use Sketchup too casually- that is, not very systematically, not taking full advantage of layers and components- and consequently, waiting to regen a view with shadows on a 100MB model seemed to take forever. I did learn that view regens depend on the amount of geometry that is visible, so I learned to navigate over the model in plan or around the edges and then zoom in to the position I wanted at the very last, so that the fewest 3D trees and other polygon-rich objects were visible. It even helps to always save the drawing in a view with little geometry visible. Also, a big performance help is to add trees and any complex imported 3D models at the last minute when everything else is finished, and still place them on a layer that can be turned off.
For general working, display in monochrome, and definitely do not turn on shadows until you need to test views for rendering or 2D image export. When navigating, keep the model in constant motion, artificially moving it about, or it will “freeze” and begin to fill in all the complex geometry. Last month (December 2012), as I was planning a Solidworks assembly of 6,000 parts, I decided to try a higher-level Quadro again.
Searching the specification charts, I was again immersed in the morass of Quadro precision and specialized application drivers vs. Geforce 3D speed at much lower cost. Interestingly, the newer Quadros seemed to have shifted their emphasis from 2D to 3D performance, in accordance with the extreme shift, especially in architectural CAD, to 3D applications like Revit. After some research, which showed the FX 4800 (384-bit, 1.5GB, 192 CUDA cores) producing stunningly good results in Solidworks- and, interestingly, this card was also optimized for Adobe CS4- I found a relatively low-hours one, about 15 months in a Precision T3500, on eBay for $150. The FX 4800 was expensive new- $1,300. The Quadro 4800 is beautifully made and very large. The Precision has a series of slots corresponding to the PCIe slots, and the FX 4800 has a rear bracket that supports the card on the back end.
The FX 4800 requires two 6-pin plugs for its 150W, still quite a bit less wattage than the GTX 285 at 204W. In the Passmark Performance Test, using the Quadro 4800, the overall rating for the T5400 was 1623 with 2D / 3D scores of 512 / 912, compared to the 583 / 2208 of the T5400 with the Geforce GTX 285, demonstrating the 3D emphasis of the Geforce. As I’m working on quite small AutoCad 2D and Solidworks files but large Sketchup 3D models, I did not notice an improvement in 2D; however, Sketchup did seem to be a bit slower in zooms and pans and when turning on shadows- which may be my imagination, as I thought shadows are a CPU rather than a GPU task.
I’ve tried a number of different drivers for the Quadro 4800, the one specifically for Solidworks 2010 and one for Adobe CS4, for which the FX 4800 was made with a special affinity. I had read that one of the principal advantages of Quadros over Geforce is the general focus on precision of display, including aggressive anti-aliasing drivers, but even though I tried a specialized Solidworks driver with 32X anti-aliasing- the highest I’d ever seen- for some reason the display in Solidworks and AutoCad was not appreciably better.
In fact, an AutoCad 3D truss made of curved sections of round tubing seemed to have some intersection anomalies not present with the GTX. Sketchup has a maximum 4X anti-aliasing setting, and even though the Quadro control panel has a kind of “override application settings” option, I thought the Sketchup models looked exactly the same as regards anti-aliasing- that is, poor, and probably at the 4X application setting. I’m trying various rendering plug-ins for Sketchup and so far the best came from the free Maxwell plug-in. Rendering is entirely CPU-based and, as compared to the single-threaded Sketchup and AutoCad, is one task that can use all the CPU cores. In Task Manager, CPU usage on the quad-core T5400 using Sketchup- a mainly single-threaded application, as are Inventor and many others- is never more than 25%, but using Maxwell, which allows you to set the number of cores to dedicate to rendering, the Task Manager CPU usage is 100%. This is a great feature of rendering programs, as you can keep a couple of cores aside so that the rendering can churn away while you work on something else, or put the computer to work on the rendering and go get another cup of coffee. I realize I haven’t spent enough time with different drivers and settings, and certainly not enough with big Solidworks assemblies and none with animations- which everyone says makes all the difference- so I’m reserving final judgement on the FX 4800.
The highest rated computer on the current Passmark Benchmark using a Quadro has a rating of 4970 and is called >“Xi M Tower PCIe Workstation”; it uses an i7 3960X 6-core on an ASUS Sabertooth X79, 16GB RAM, a Quadro 4000 (2GB), and an “LSI MR9240-4i”, which is a RAID controller. The 2D / 3D scores are >952 / 1981. Notice that the T5400 with GTX 285 produces 583 / 2320- much lower 2D, but higher 3D.
The Memory (2913) and Disk (5056) scores of the “Xi M Tower” are very high as compared to my T5400’s 646 / 956. It’s noteworthy that, overall, the Passmark benchmarks of similarly configured computers seem to rise, sometimes dramatically, when they include SSDs.
Now, for the interesting part >the highest 2D-rated computer on the current Passmark Benchmark uses a Geforce GTX 550 Ti and an i7 3770K @ 3.5GHz 4-core with 32GB RAM, for an overall rating of 4744 and 2D / 3D scores of 1087 / 2157. Note that the 3D score is slightly lower than that of the GTX 285 on the T5400, with its 1623 rating. The No. 2 highest 2D machine has a rating of 4656, this on an i7 2600K @ 3.4, 10GB, and a GTX 670, and this time the 2D / 3D is 1053 / 6089. Note the 2D score is similar, but the 3D score is substantially higher than that of the No. 1 GTX 550 Ti machine.
Perhaps the 1344 CUDA cores, as compared to the 240 of the GTX 285, affect this? The highest Xeon / Quadro 2D score comes from a Xeon E3-1270 @ 3.5GHz 4-core and, amazingly, a 1GB Quadro 600, a $150 card, scoring 2D / 3D of 818 / 704. It’s interesting: if you are working in 2D, some of the lower-end and older Quadros like the 512MB FX 580- $40 on eBay- seem to produce really strong results in 2D, but like the current Quadro 400 and 600, the 3D score will be low. The highest 3D-rated machine is rated overall at 4523, using an i5 2500K @ 3.3, 4-core, 16GB, and a GTX 680, producing a 2D / 3D of 855 / 6598. The memory rating is very high >3008, with a disk score of 1852. The No. 2 3D machine is an i5 2600K @ 3.4, 4-core, 32GB, and again a GTX 680, for a 2D / 3D of 925 / 6346, a slightly better 2D than the No. 1 3D configuration. Interestingly, the top two 3D machines use quad-core i5's.
The highest rated computer on the current Passmark Benchmark using a GeForce has a rating of 5622; it uses an i7 990 6-core @ 3.47, 12GB RAM, and a GTX 580, and produces 2D / 3D scores of 911 / 5501. The GTX 580 is interesting, as it has a 384-bit memory bus and 512 CUDA cores, and it seems to me that the computers with high graphics scores favor GPUs with the wider 384- and 512-bit memory buses. I find the balance of both a high 2D and 3D in this configuration very attractive. All the GTX 5-series cards seem to strike a good 2D / 3D balance. By the way, the GTX 580 takes a lot of power- 244W, as compared to the already high 204W of the GTX 285 and 150W of the Quadro 4800- and the GTX 690 requires 300W.
A Quadro 600 takes only 40W. The highest 3D rating using a Quadro has an overall of 4523, that on 2X Xeon E5-2687W @ 3.1GHz, 8-core, a Quadro K5000 (4GB, PCIe 3.0), and 65GB RAM, producing a 2D / 3D score of 597 / 4134. Interestingly, the 2D is not very impressive for this computer (e.g. the T5400 with GTX 285 produced 583 in 2D), and this probably very expensive machine uses 2X 8-core Xeons (= $3,800 in processors alone!), 65GB of RAM, and a $1,700 Quadro K5000 (4GB).
This may reflect the fact that most CAD applications are single-threaded, such that processor clock speed is more critical than the 16 cores. The 3D score, though, shows- and I saw this many times in Passmark scores- that Quadros are shifting to an emphasis on 3D performance. On the other hand, as rendering can use every core, I imagine this computer would be great at that! Note that the single-i7 990 GeForce machine with a GTX 580 surpasses this one in both 2D and 3D. The Disk score of 7023, one of the highest I ever saw, also suggests some kind of enterprise-class drive setup, no doubt a pricey item as well.
Was this one perhaps optimized for video editing? Summary: It’s worth noting the close inter-relationship of CPU, GPU, memory, and disk performance, and system synergy- good CAD / graphics solutions will not be found with a hot rod graphics card alone. Most CAD and graphics applications- except rendering- are still mainly single-threaded, so CPU clock speed is critical. If you are doing large renderings, use a CPU with the highest clock speed and as many cores as is reasonable. As the current Quadros seem to have begun to shift performance towards 3D, for the best balance of 2D / 3D CAD and graphics performance it seems to me a recent Geforce using GDDR5 is difficult to beat. Look for the cards with a wider memory bus- 384, 448, and 512-bit are better- and memory bandwidth helps as well: the Quadro FX 4800 has 78GB/s, the GTX 285 159GB/s, and the GTX 580 and 690 both about 192GB/s.
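If you want to sanity-check those bandwidth figures (mine are as I noted them from the spec charts, so treat this as a rough rule of thumb rather than gospel), the arithmetic is simple: bandwidth in GB/s is roughly the memory bus width in bits, divided by 8 to get bytes, multiplied by the effective memory data rate in GT/s. For the 512-bit GTX 285 with GDDR3 at about 2.48GT/s effective, that is 512 / 8 x 2.48 ≈ 159GB/s, the figure quoted above; the same arithmetic is why narrower 128- and 192-bit cards have trouble keeping large models fed regardless of shader count.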
Note that the 1GB, 512-bit GTX 285 (used, $100) has a memory bandwidth not too far off that of the more modern 4GB, 256-bit 690 ($1,000). It’s entirely possible that the specialized drivers for applications such as Solidworks may offer the serious large-assembly maker advantages in precision and anti-aliased display, but I have yet to see this for myself. I should also mention that I have had very good luck with used graphics cards, possibly because people seem to experiment and upgrade often, so if you’re on a budget, or want to experiment, it seems possible to buy used, find the right direction by multiple tries, sell the experiments for at or near the purchase price- making the experiments almost free- and then buy new. This is in accordance with one of my favourite adages, “Measure twice and saw once.” Thanks for getting this far in such a very long post. I hope this helps someone avoid my time-consuming self-torture in finding a good CAD / graphics card and/or workstation solution.
Cheers, BambiBoom ____________________________________________________________________________ PS>Based on Passmark results for overall rating, CPU, 2D, 3D, memory, and disk performance, here’s a quick specification for >“BambiBoom’s Reasonably Priced (well, $2,700 @ Newegg) Hot Rod CAD / Graphics Workstation” >which shouldn’t be too shabby for games either >>>Intel Core i7-3930K Sandy Bridge-E 3.2GHz (3.8GHz Turbo) LGA 2011 130W Six-Core Desktop Processor BX0K $569.99 >Xeons are fantastically accurate and stable, but locked and very expensive. This i7 appears to be a good overclocker and, poking around the overclocking world, appears to be very stable at, say, 4.2 and even reliable at 4.4GHz. See the related liquid cooling listing below!
>ASUS Sabertooth X79 LGA 2011 Intel X79 SATA 6Gb/s USB 3.0 ATX Intel Motherboard $339.99 - several of the very high Passmark benchmark computers use this particular board >G.SKILL Ripjaws Z Series 32GB (4 x 8GB) 240-Pin DDR3 SDRAM DDR3 1866 (PC3 14900) Desktop Memory Model F3-14900CL10Q-32GBZL $179.99 >The ASUS Sabertooth can use 64GB, and that’s not a bad idea, especially as RAM is so cheap now. When I had my 1993 IBM 486 ($1,900) running Windows 3.1 over DOS 6 at 50MHz, 2MB- that’s MBs, not GBs- of RAM cost $180!
As I’ve become more fluent with 3D, I seem to end up too often running AutoCad 2007, Sketchup 8 Pro, Corel Technical Designer X-5, Photoshop CS4, and Mozilla Firefox simultaneously, and these, with everything else going- OS, backup, security, etc.- can add up to about 10GB of my 12. AutoCad, which I use mainly in 2D, is not too resource-hungry, but Solidworks and Sketchup occasionally take 2GB each, though Sketchup typically runs in about 850MB-1.4GB. Some rendering programs I’m test-driving appear in Task Manager as using all 4 cores and 2GB.
By the way, the old T5400, having a dual-CPU server motherboard (similar to the PowerEdge 2950), can use 192GB RAM (8 x 16GB)! >EVGA 03G-P3-1594-KR GeForce GTX 580 (Fermi) Classified 3GB 384-bit GDDR5 PCI Express 2.0 x16 HDCP Ready SLI Support Video Card $499.99 >or about $250-300 as an eBay “experiment” >See the text above for the reasons for this choice. >Kingston HyperX 3K SH103S3/120G 2.5" 120GB SATA III MLC Internal Solid State Drive (SSD) (Stand-Alone Drive) $101.99 >For OS and applications. SSDs seem to be fast, fast, fast, but based on reading dozens of feedback comments, are also too often quirky to install, unreliable, and short-lived. I don’t trust them! My thought is to use an SSD for OS and programs for speed, but keep all the data safely on enterprise-version mechanical drives mirrored in RAID. I'd keep a full system image backup on the mechanical drives at all times, ready to go as well!
>2x Seagate Constellation ES ST1000NM0001 1TB 7200 RPM SAS 6Gb/s 3.5" Internal Enterprise Hard Drive - Bare Drive $299.98 ($149.99 each) >For DATA in RAID mirroring >LIAN LI PC-V750WX Black Aluminum ATX Full Tower Computer Case $379.99 >Relatively expensive, but I like very plain, solid cases- roomy, with good cooling / venting- and this one has convenient USB 2 and 3 ports on the front. >CORSAIR HX Series HX850 850W ATX12V 2.3 / EPS12V 2.91 SLI Ready CrossFire Ready 80 PLUS GOLD Certified Modular Active PFC Power Supply $169.99 >As it’s possible to add another or even two more 240W GPUs to this configuration, I would strongly consider a 1000W PSU. The T5400 has an 875W PSU for comparison. >CORSAIR Hydro Series H60 (CWCH60) High Performance Liquid CPU Cooler $64.99 >This is not a highly researched choice, but mainly a note to try and make life easier for the overclocked 3930K. Siddharthmukul007, As I'd hoped to relate in my original post, I'd long thought, as you seem to, that Quadros were unbeatable in every kind of graphics application, but my subsequent comparisons of Quadro (3) and Geforce (1) cards in my own use, and analysis of the specifications and benchmarks of many, many others, tell a very complex- hence the length of this post- and different story: mainly that graphics applications are consistently shifting to an emphasis on 3D performance, and there are many other system factors affecting the total experience.
You write, 'nothing from the consumer graphics beats the qudro 4000 and above in dcc', but I'd appreciate seeing hard evidence or a good description of the experience behind that statement, as I saw many Geforce GTX cards- even some 8800 GTs- that surpassed the Quadro 4000 in 2D and especially 3D benchmarks. That said, I'm still open to the possibility that I haven't optimized my FX 4800- which, by the way, surpassed some Quadro 4000s in 2D benchmarks- and I may well find when doing my 6,000-part Solidworks assembly that Quadros are 'unbeatable.' So far, the numbers say otherwise. Thank you for your reply. Cheers, BambiBoom.
There is something seriously rotten/wrong/odd about the workstation card market. For instance, the supposedly high-powered $1,000 Quadro 4000 has some ridiculously underwhelming specs (a 625MHz core for starters- and this is a normal Fermi core, one generation OLDER than the Kepler core, which the GTX 680 runs at 1100MHz!). All Nvidia's cards use the same chips, but as far as I can tell Nvidia has *severely* crippled the consumer-line GTX cards in order to sell Quadro cards, but the Quadros are _also_ crippled in order to sell Tesla cards. Nvidia's wettest dream, embodied in the Maximus drivers, is that you buy a Quadro to get decent display speed, and then a Tesla to get decent computational speed.
Even with this crippling it seems Nvidia is extremely unwilling to let Quadros face off against Geforce cards. I can see no other reason that there isn't _a single benchmark on the entire net_ which compares Quadros to Geforces to Radeons to FirePros on computational speed (and I'm not talking 3DMark score or framerate in Crysis 3 here, but single and double precision number crunching through CUDA and OpenCL). My impression is that when you buy a Quadro you pay for the driver and the firmware- the Quadro cards are clearly very weak, but still probably manage to outperform the *far* more capable Geforce hardware with its intentionally crippled firmware and drivers. This seems to be what all the 'Quadros suck at games' and 'get a Quadro for graphics work' is really about: the Quadros are slow as heck, but their maths capabilities are less crippled than the faster Geforce's. Nvidia is abusing the market because it has no serious competition. Lol, don't compare Fermi clocks to Kepler-based card clocks.
Have you seen a GTX 580 clocked at 1GHz-plus running on air? You will be going for a 3DMark world record if you can overclock a 580 above 1GHz. Also, these so-called pro cards were clocked lower than their Geforce counterparts to ensure stability. Performance is important, but in the professional world they can't compromise on accuracy and precision. As for Geforce vs. pro, it is open for debate. In the past, all you needed to do was flash the BIOS and your GTX would change into a Quadro.
Some people don't like being forced to buy pro cards because the gaming card was crippled for pro applications, but the fact is R&D isn't cheap.
It's not just in benchmarks where you see that the issue of relative performance is more complicated than Nvidia or Autodesk would have you believe. My personal, real experiences (as well as many others') using the software have so far shown me no reason to believe Quadros are more effective than GTX at many of the things they are claimed to be better at. - Viewport / Model Space performance in AutoCAD (both 2D and 3D) and Revit: This is something that is advertised as a 'workstation' card specialty- i.e., zooming, panning, and rotating model elements is supposed to be smoother and faster with the 'certified' card than with the 'gaming' card. I have found this not to be true with many of the files I work with. Model element manipulation that causes my GTX 570 to bog down is not one whit faster or smoother with the likes of a Quadro 5000 (maybe a 6000 is different, but I don't have $3-4,000 lying around to test that theory, and I don't know anyone with a Quadro 6000 that I can borrow).
I have compared the same files with a FirePro V5900 I have handy- same result- not any better, or at least by such a small amount that it is not detectable. Stuff like more precise anti-aliasing shows minimal if any performance gain. This is just my experience, true- but I associate with many CAD users in both the professional and academic realms, and the majority of users (without any prompting) corroborate my findings on a wide range of file types. For you GPGPU folks- it's true that with Kepler, Nvidia effectively hamstrung the GTX's abilities on that front, but the Fermis have been shown to be faster with GPU acceleration, owing to their greater number of CUDA cores; if I am paying much more for the pro cards, I expect them not to have fewer cores than the gaming versions.
Like I said, it appears that Nvidia has 'rectified' this with the Keplers. The GTX 6xx and the Kxxx Quadros seem to have equivalent core counts at the different tiers, and the GTX's cores have been crippled with respect to GPU-acceleration performance.
Why are the Geforce (Kepler) cards so *horribly* crippled? Because otherwise there'd be no reason to buy Quadros. Why is the processing power of the quadros _also-but-somewhat-less_ crippled? Because otherwise there'd be no reason to buy Teslas. Why don't Teslas have graphics ports? Because if they did there'd be no reason to buy Quadros. It's the same hardware, crippled in different ways.
This is done so Nvidia can segment the market and pretend it's three different product lines, when in fact they sell the same hardware to budget-conscious gamers as to graphics pros and research institutes. What Nvidia is doing is not illegal, but it sure is ugly. GeForce >What’s Best for an AutoCad / Solidworks / Sketchup / Adobe CS Workstation? >Round 11,349 2.4.13 Mates, While the debate between Quadro vs. Geforce continues, I found a brief but informative post on this subject that nicely clarifies the roles, in terms of performance, of the various hardware and software components, reinforcing the idea that better graphics performance depends on good choices for CPU- for example, that rendering can use as many cores as may be assigned- as well as GPU, the right drivers (the difference between detail- and frame-rate-oriented drivers), video memory, and so on. RE: Quadro vs.
GeForce >What’s Best for an AutoCad / Solidworks / Sketchup / Adobe CS Workstation? Update >2.14.13 Mates, The following article, 'Nvidia Quadro K5000 Professional Graphics Card Review' by Ilya Gavrichenkov, 1/16/2013, posted on the X-Bit Laboratories site, is one of the better articles I've read that describe and quantify the differences in performance priorities of Nvidia Quadro professional workstation cards in comparison to game-oriented Geforce cards. This review concentrates on a comparison of the Quadro 5000 to its successor, the K5000 >X-Bit compares the performance of the recently released, Kepler GK104-based, 256-bit, 4GB Quadro K5000 to its predecessor, the Fermi GF100-based, 320-bit, 2.5GB Quadro 5000, and, in the gamers' corner, the Geforce GTX 680, which uses the same GK104 GPU as the K5000 and of which a pair is used in the GTX 690. I find the comparison of the K5000 and GTX 680 informative, as it demonstrates how the same GPU and general architecture can produce such different results by changing clock rates and drivers. Also, performance was seriously affected by anti-aliasing settings- something 3D CAD users obsess over the way gamers are in a benchmark frenzy about frame rates- and the GTX 680 with high anti-aliasing settings was a poor performer by a wide margin. The following paragraph from the article contains a definitive summary statement >>'Meanwhile, SPECViewperf makes it clear that using gaming graphics cards for professional applications isn’t a good idea. Having the same architecture as the Quadro K5000 and even higher clock rates, the GeForce GTX 680 has huge problems handling complex OpenGL-based models.
Its speed is much lower compared to the specialized solutions with optimized drivers.' >'Autodesk AutoCAD 2013: Here’s another popular 3D modeling suite that uses DirectX.
As with 3ds Max, Nvidia offers special support for AutoCAD developers but, unfortunately, the corresponding mini-driver for the Quadro series hasn’t been updated for a long time. Its latest version is only compatible with AutoCAD 2011. That’s why users of this software suite cannot fully benefit from the advantages offered by Nvidia’s latest professional cards.
It means that there is no fundamental difference between the Quadro and GeForce series products based on the Kepler architecture.' >'Gaming is different from professional applications, as you can see. The GeForce GTX 680 is much faster than the Quadro K5000, although both are based on the same architecture and even use the same GPU. The GeForce GTX 680 just has much higher clock rates, which cannot but tell on its gaming performance. We didn't see that in the professional applications. Because of application-specific optimizations which can improve a graphics card's performance in CAD applications irrespective of its hardware capabilities.'
In my view, this discussion is going to take a serious shift in direction- soon -as a string of parallel GPU processing units like the Tesla K20 can create astounding personal supercomputers. Have a look at how many TFLOPS you can get at home already for $10,000. CPU's will someday be called something like 'GPUC's' or 'Graphic Processing Unit Controllers'.
Model a century of oceanic current dynamics for the high school science fair! DIY home protein folding! NAMD molecular-dynamically optimized cake recipes! And gamers, you're not left out. Some bright computing futurist mentioned that if the code were optimized for it, as of today, on 4X Tesla K20s, Crysis 3 could run at more than 1000 fps at full settings. Ancient computer history to make Xeon and Quadro purchasers feel they're getting a bargain. Note: these figures are MBs and not GBs >The 2MB RAM upgrade to my 1993 IBM 50MHz 486 ($2,300 including 14" monitor) cost $180, and maxed the system RAM at 4MB. The 85MB HD contained >DOS 6 (OS), Windows 3.1 (GUI), AutoCad 10 for DOS, Wordperfect 6- the first graphical interface version- Corel 3 Graphics Suite, and all the files I'd ever done on a computer for 6 months. AutoCad cost $480.
And it came, I think, on five 1.44MB 3.5" floppies. A large *.DWG was 100K. A 540MB HD (the OS could only see a maximum of 528MB) cost $550- about $1 per MB was the ratio!
Thanks bambiboom for your detailed post. It was extremely long but had some great information on it. I am trying to decide on a card while I let my integrated graphics card in my Asus MAXIMUS V formula board do the work for now; it is surprisingly not that painful.
I know I should have gone with a 2011 socket, but I realized that too late (should have done a few more days of research). Anyhow, I am trying to decide between a 670GTX, a K600, & the K2000 myself. I am more of a hobbyist, but when I do delve into these applications, I do want them to move fast. I do some very simple and light video editing with very high resolution files shot at 1440p at 60 fps (GoPro Black Edition), some Photoshop work that is becoming more and more complex as I learn more, some AutoCad work on the side in 2D using ACAD 2013, and am now playing with Sketchup Pro while designing my home. So my 2D is more demanding than my 3D tasks, but that may begin to change in the near future. My budget is $400-$450, though I would rather pay less if there is not that huge of a performance difference.
I wonder if Nvidia has worked on their AutoCad 2013 drivers so they would be optimized for the Quadros, because if they have not, I think the 670 GTX with 2GB of GDDR5 memory would have to be the choice over the K2000. When I work in AutoCad, it has to be fast. However, I am not sure if the Quadros would outperform the 670 GTX in Photoshop & if the advantage I gain in AutoCad would be trumped by the disadvantage in Photoshop.
I assume the 670 GTX would perform better in Sketchup Pro, but perhaps the optimized drivers (something I have to learn more about- do you really have to install a different driver with each program and constantly change the driver you are using?) for the Quadros would allow them to outperform the 600-series GTX line.
Any additional information offered to help me make my choice, including some other suggestions, comments, or even criticisms, is welcomed and would be greatly appreciated.
Esumsea, The Quadro vs. GeForce debate is always a difficult question of degree. Up to a certain point of complexity and quality, a GTX may work very well, but if the quality expectations are very high and the applications fall into the CUDA-accelerated category and/or have 'partnered' drivers to optimize performance, the Quadro will have substantial advantages.
My experiment with a GTX 285 was disappointing >it would not open viewports, gave bizarre 2D views of 3D models, limited anti-aliasing, artifacts (the dreaded 'rain of lines' syndrome), and corrupted shadows, and Sketchup renderings crashed. This was all solved by replacing a $350 GTX with a $1,200 Quadro- which I bought used for $150.
The Passmark rating dropped from 1909 to 1859, with lesser 3D benchmark performance, but my 'slower' system saved many, many hours of crash recovery and rendering restarts. I am a very reluctant computer technician- if I'd never looked inside one, I'd be a happier person and the neighbours would have heard less screaming. Thanks to Windows 95, I learnt how to swear in Hungarian.
The more complex and larger the tasks I perform, the more I'm convinced that I will probably own a Xeon / ECC RAM / Quadro or equivalent computer for the rest of my life. So, if you are using Autodesk and Adobe applications, both have a symbiotic optimization with Quadros. The FX 4800 I use was especially optimized for Adobe CS4, and there are partnered drivers for Solidworks 2010.
The Quadro drivers and emphasis on image quality rather than gaming frame rates are the key to their value and cost- imagine 128X instead of 16X anti-aliasing. Whether you would be unhappy with the GTX qualities and/or will take advantage of the Quadros in your work is difficult to say. You might get along very well with the GTX 670, which has generally very good 2D and excellent 3D performance, but if you are or become highly particular about image quality, as opposed to speed of operation, I'd say use a Quadro. The K600 is attractive, and very good in 2D, but is not stellar in 3D- it may make you unhappy in Sketchup. From a cost/performance standpoint, for a new card, the K2000D would be my choice. On Passmark, the highest K2000 2D score is 911 and 3D is 1762, while for the K600, 2D=883 and 3D=875. This leads me to another option to consider, which would be to buy a used Quadro 4000.
These are common and have been around long enough that I see many in the $300-350 range, and the highest scores are 2D=1103 and 3D=2924. Be aware that the Quadro 4000 is known to run hot, so when buying used, find out what kind of use it has seen and any history of possible overheating. If you stay on the GTX side, you might consider, instead of the GTX 670, using a GTX 580 in the 3GB form- 384-bit instead of 256, the same 192GB/s bandwidth, and 1GB more memory. These are not cheap- still $350 used- and have fewer CUDA cores (512), but they have a reputation as excellent video editing cards- really fantastic 2D function as well. The price will continue to drop, and if you have the power capacity, you might add a second one in SLI- have 6GB of memory- and edit feature films.
For discussion purposes, if I were building a long-term-use, generous-budget system that would be performing tasks very similar to yours, with an emphasis on 3D modeling, 2D effects, processing, rendering speed, and image quality, it would be >The BambiBoom Pixeldozer Espresso TurboKlonk 3000 CAD / Imaging / Rendering / Editing Workstation ® © ™ ℞ _5.31.13
1. Xeon E5-1650 6-core 3.2 / 3.8GHz $600
2. ASUS P9X79 WS LGA 2011 Intel X79 SATA 6Gb/s USB 3.0 SSI CEB $380
3. Kingston 32GB (4 x 8GB) 240-Pin DDR3 SDRAM DDR3 1600 ECC Unbuffered Server Memory w/TS Intel Model KVR16E11/8I $300
4. NVIDIA Quadro K4000 3GB GDDR5 PCI Express 2.0 x16 Workstation Video Card $800
5. SAMSUNG 840 Pro Series MZ-7PD256BW 2.5" 256GB SATA III MLC Internal Solid State Drive (SSD) $250
6. 2X WESTERN DIGITAL 1TB HARD DRIVE SATA 64MB 6Gb/s WD AV-GP $170
7. LIAN LI PC-A75 Black Aluminum ATX Full Tower Computer Case $182
8. SeaSonic X750 Gold 750W ATX12V V2.3 / EPS 12V V2.91 SLI Ready 80 PLUS GOLD Certified Full Modular Active PFC Power Supply $150
9. Microsoft Windows 7 Ultimate SP1 64-bit - OEM $190
10. Noctua NH-D14 120mm & 140mm SSO CPU Cooler $84
11. ASUS Black Blu-ray Burner SATA BW-12B1ST/BLK/G/AS $85
TOTAL >about $3,200
I apologize for rambling on; for gamers the GPU questions can be more easily summarized in frame rates at whatever setting- a simple quantity- but for content creation, the issue of quality and stability against speed- and higher cost- is subjective and becomes a complex equation. Cheers, BambiBoom.
Hi all, sorry to sorta hijack your thread a bit, but I figured 'why not ask a community with the experience and know-how for some help?'
In short, I've heard a lot of bad stories about using the Quadro 4000 when doing primarily 2D (2007), even up to the point where, say, an 8800GT outperforms it. Looking at the numerous other options, would you say that going for a GTX card would be better in this instance? Or maybe even an AMD card?
Thanks in advance! Dylan
Dylan, Of course, I haven't used every graphics card, so I rely on benchmark baselines like Passmark to have a comparative sense of what works better than others. One of the very revealing features of Passmark is that any single GPU produces a very wide range of 2D and 3D scores.
The Quadro 4000 is a prime example >the top 2D score is 1103 and the bottom is 304; the top 3D is 2924 and the lowest is 1128. The 8800 GT's 2D ranges from 952 down to 242, and its 3D goes from 923 down to 352. So, given the right system components, it appears some 8800 GTs could outperform some Quadro 4000s in 2D, while no 8800 GT will outperform a Quadro 4000 in 3D.
Some Quadro 4000 results are anomalous because they are finding their way into systems like Dell Precision 390s with 2GHz Core 2 Duos. There's a reason why an older card like the 8800 GT will hold its own in 2D against a modern one. Graphics cards in general- and Quadros in particular- have for years been shifting from a 2D emphasis to 3D. For GeForce / Radeon this is because of the extreme demands of 3D games, and for Quadro / FirePro, the rise of 3D CAD. If someone is working exclusively in 2D image creation, expects the best possible image quality, and doesn't want to spend a pile on a graphics card, the best approach is to find a three-generation-old Quadro, from that far-off time when 2D was king. A Quadro FX 1800 (768MB) on an i7-3930K / ASUS Sabertooth X79 / 65GB RAM / SSD system produced a 2D score of 1037- but only 623 in 3D.
Whoever built that system using a $570 CPU, a $350 motherboard, and, interestingly, over $500 of RAM has to be a Photoshop / Illustrator / rendering fiend- and also knew exactly what graphics card to buy. And the good news is that a Quadro FX 1800 is a $50-80 card. To put that in perspective, a $1,000 GTX Titan shows Passmark 2D scores between 1215 and 630, but only 4 of 106 Titans scored higher in 2D than the top Quadro FX 1800. Who said Quadros are overpriced- that's a $950 savings! While benchmarks are a good way to start graphics card conversations, the important difference between Quadro and GT/GTX is the drivers and the consequential quality advantages- for example, 128X anti-aliasing instead of 16X, more refined shadows, particles / fluids / reflections, and color gradients. Even though the two lines use the same GPU- the 8800 GT and Quadro FX 3700 share the G92- the Quadros focus on image refinement and subtlety, and the GTX is configured to be frame-rate oriented. [By the way, I think (not absolutely sure) it was up until the G92 that a GeForce could be soft-modded (RivaTuner) into believing it was a Quadro so the Quadro drivers could be used.
NVIDIA caught on and, to protect their driver development investment, stopped the soft-mod possibilities.] A GTX is made to move on to the next frame, where a Quadro will sit back and complete all the shadows and gradients. The quality difference can't be described in numbers, but professionals deeply involved in imaging content creation seem inevitably to end up with Quadros. Cheers, BambiBoom [Dell Precision T5400 >2X Xeon X5460 quad core @ 3.16GHz, 16GB ECC, Quadro FX 4800 (1.5GB) [2D=517, 3D=1097], WD RE4 / Segt Barcd 500GB >Windows 7 Ult >AutoCad, Revit, Solidworks, Sketchup, Adobe CS MC, Corel Technical Designer, WP Office, MS Office]. Gents, Please review this article: Geforce cards outperform Quadro cards in AutoCAD and MOST* Autodesk software in 3D. When it comes to 2D the results are virtually the same.
I worked for Intel and manage 250+ systems used in our BIM department, and I have done extensive testing on my own as well. What you do find is that the Quadro series cards will far outperform the Geforce series in Hidden and Conceptual views inside of AutoCAD, but that Wireframe and Realistic perform 4-5x better on the Geforce cards.
This is because the drivers for these two cards are intended for different purposes: Geforce is intended for speed and one-sided renderings, whereas Quadro is intended for precision and accuracy. With the Quadro series you'll notice fewer 'artifacts' or graphical display errors, although the number of these you notice on the Geforce is very minimal. I choose to use the Geforce series because most of the work we do is in Wireframe and Realistic views, and the difference in cost between the two series of cards is too substantial to justify. In addition, the total available VRAM on the card plays a very large role when working with larger models; therefore I make sure to buy the 4GB versions of the 680s as opposed to the standard 2GB (to use that as an example). Lastly, AutoCAD- and, I believe, ALL Autodesk programs- are not multi-threaded. This means that paying out large amounts of money for a serious processor is not going to benefit you in the least when working in these programs.
They will utilize multiple threads and cores when performing a render, but that is the only time the higher-priced processors are beneficial. As a result, it's important to pick your processor wisely and try to remain cost-effective. The Z77 platform tends to be 13-15% better at single-threaded processes than the X79, which means that for far less money you can run your Autodesk applications faster and more efficiently.
Lastly, Autodesk software DOES NOT support SLI configurations, so don't waste your money on multiple cards unless you have a genuine need for a Maximus setup for some serious computation. Edit: Passmark benchmarks are completely irrelevant when it comes to Autodesk software. RE: Quadro vs. FirePro Gentlemen, In the Quadro vs. GeForce conversation, it seems useful to also consider AMD FirePro workstation cards, as Quadro and FirePro are the direct competitors for imaging systems >CAD, 2D modeling, animation, graphic design, industrial design, simulation, video processing and editing.
BambiBoom, what did you end up doing? Sorry to resurrect an older post.
Do you have a debriefing after all this research you went through? Did you build the rig you described in your last post? It seems like a pretty legit one, so how did it do against GTX cards on CAD? My real question is what you would say about the Titan GPU (Nvidia) from your research. It says it supports DirectX 11.1 and OpenGL 4.3. Isn't this exactly what you are looking for?
A card with maxed-out CUDA cores (2,688) and good bandwidth (384-bit), with drivers that handle both DirectX and OpenGL? Would this card not strike the balance? Perhaps you're not considering it since the price is prohibitive, but if you could convince your IT department, what main point would you go to them with? I'm trying to do coordination in contracting with CAD MEP 2014 and Navisworks Manage- mostly spinning large buildings around with every trade's MEP models in there- so to me a DirectX focus might be what I want to be able to see what I'm working with actively. I'm not doing harsh number crunching, video editing, rendering or anything like that, mostly viewing in 2D wireframe or loose geometry in Navisworks, but I don't want to have to wait for my image to appear while I'm rotating it around.
Thanks for your answer. I have been looking at the DELL PRECISION Mxxx line and at LENOVO. I do not need a desktop at this moment; I prefer a mobile workstation because of my working style- it is better for me to have portability, something like a 15.6' with a nice resolution. Later I think I will build a desktop, perhaps if I ask my boss to build a new one for the office. I have looked at the Lenovo W530, but it is missing the numeric keypad. Searching around, I found that a new Lenovo model is about to arrive, the LENOVO THINKPAD W540- 'simply the perfect mobile'- with a big resolution, about 2880x1800 I think. Have a look, please. But I have a big problem, which is finding a direct vendor for DELL or LENOVO in Portugal. If I need maintenance or have to resolve future issues, I have to send the machine back by post, and that is a big concern for me. So I have been looking again at the CLEVO P150SM, which offers big performance and portability at low cost compared to the big worldwide brands. So I am confused about whether to bet on a Lenovo or a DELL. In fact, one side of me tells me to invest in a professional laptop like a DELL, but I don't personally know anyone who has one whom I could ask about the most common issues and problems, so I don't know if a DELL is really the best choice. Perhaps I should try eBay for a refurbished one at half the cost. Is a DELL really the better bet among professional workstations? Will it work without any problems for the next 5-7 years? My last laptop was an old CLEVO M570RU 17' (1900x1200) with an Intel Core 2 Duo T7700 2.4GHz (4MB cache) and an Nvidia 8700M GT 512MB MXM-III, and it is still working like a charm. Never an issue, never a crash or a blue screen, never incompatibilities; easy to clean and very upgradable. The one bad thing is the weight, but until now I have not had to travel much with it. So, because of this kind of uncertainty, I am asking for facts and arguments on the PC forums. If you were me, what would you do?
Architex.art, Sorry, I somehow missed seeing you had responded. I have heard very good comments about Dell Precision laptops and have seen on Passmark baselines that these can have 2D and 3D scores as high as very good desktops.
These often have i7 CPUs, Quadro K4000Ms and, most importantly, 17.3' screens. I have never had a laptop, principally as I find a 27' screen a minimum in programs such as Solidworks or Adobe CS, as there are so many menus, viewports, and multiple applications open at once. Your budget of 2500Eu should get a nice one. Perhaps you could find a 'New other' M6700- a system that had been purchased but almost never used. This is a fairly random selection from eBay (11.25.13), and this one has an i7-3940XM Extreme Processor (3.0 GHz, 3.9 GHz Turbo), a Quadro K4000M 4GB, and a Dell UltraSharp 17.3' screen. The price of $2,400 is equal to 1776 Eu. Again, this was selected quickly and I am not recommending this particular one, but it demonstrates how high the specification of these laptops can be.
Cheers, BambiBoom. Siddharthmukul007, As I'd hoped to relate in my original post, I had long thought Quadros were unbeatable in every kind of graphics application, as you seem to, but my subsequent comparisons of Quadro (3) and GeForce (1) cards in my own use, and my analysis of the specifications and benchmarks of many, many others, tell a very complex- hence the length of this post- and different story: mainly that graphics applications are consistently shifting toward an emphasis on 3D performance, and that there are many other system factors affecting the total experience. You write, 'nothing from the consumer graphics beats the qudro 4000 and above in dcc', but I'd appreciate seeing hard evidence or good descriptions of your experience behind that statement, as I saw many GeForce GTX cards- even some 8800GTs- that surpassed the Quadro 4000 in 2D and especially 3D benchmarks. That said, I'm still open to the possibility that I haven't optimized my FX 4800- which, by the way, surpassed some Quadro 4000s in 2D benchmarks- and I may well find when doing my 6,000-part Solidworks assembly that Quadros are 'unbeatable.'
So far, the numbers say otherwise. Thank you for your reply. Cheers, BambiBoom Question mate. I am looking to buy a rig that has the GeForce GTX 780 3GB GDDR5 16X PCIe 3.0 video card which I understand is great for the gaming experience. The rig I'm planning on buying will provide me with some gaming fun but I also plan to run Solidworks on it as well.? I work with small components to 50k part assemblies which are beastly when it comes to performance issues. I'm just wondering if this video card will play nice with Solidworks and if so, what are some of the pro's and con's (if any.?) Thanks in advance if you take the time to answer my question.? Tubedog10X, You wrote >_____________ >'Question mate.I am looking to buy a rig that has the GeForce GTX 780 3GB GDDR5 16X PCIe 3.0 video card which I understand is great for the gaming experience.
The rig I'm planning on buying will provide me with some gaming fun but I also plan to run Solidworks on it as well.? I work with small components to 50k part assemblies which are beastly when it comes to performance issues. I'm just wondering if this video card will play nice with Solidworks and if so, what are some of the pro's and con's (if any.?) Thanks in advance if you take the time to answer my question.?
' The GTX 780 is a very high performance gaming card, but I believe that, just as in my experiments with the GTX 285, the 780 will probably not open viewports nor run high anti-aliasing factors and will, overall, be strange and sluggish. In order of best first, for Solidworks >Quadro 6000, K5000, Firepro W9000, Q 5000, W8000, W7000, K4000, Q 4000, W5000, Q5000, F V3900, V4900, V5900. If you are working with Solidworks assemblies as large as 50,000 parts, I would recommend the Quadro K5000 (4GB, 256-bit, 1536 CUDA cores, 122W) just to have that level of capability available. Also, now that the new Quadro K6000 (12GB, 384-bit, 2,880 cores, 225W!) is available ($5,000), the 6000 (6GB, 448 cores, 205W) has dropped in price and can be purchased 'reasonably'- I saw an Amazon seller offering them new for $2,300, and there have been a few eBay sales used under $1,000.
I think a lot of workstation users don't even search for the Quadro 6000, as they expect it to still be $3,600. They do take a lot of power; the system would probably need a 750W PSU or so.
After the K5000 and 6000, I'd say consider a Quadro 5000, possibly even a used one- these are sometimes now (12.13) in the $500 range. Used Quadros are tempting, as they are somewhat understressed and made for the long haul- on all the time and running full bore. I've had 5 used Quadros over the years (FX 550, 570, 580, 1800, and 4800) and never a failure. If cost is a consideration- and when is it not?- then a used Quadro 5000 (2.5GB) is well worth a look.
I am having very good results in Solidworks (2010) with a Quadro 4000 (2GB, 256-bit, 256 cores, 122W) in an HP z420 (Xeon E5-1620 3.6 / 3.8GHz, 24GB ECC 1600), but so far I haven't done any large assemblies. I had a K4000 in my sights for a while, but it's 192-bit, and I've always felt that wider-bandwidth cards- the 6000 is 384-bit, the 4000 and K5000 are 256-bit- worked better in my use. When I upgrade the z420 I'm inclined to go to >Xeon E5-1650 V2 (six core 3.5 / 3.9), 32GB RAM, and a Quadro K5000. I've never used a K5000, but the reviews, tests, and comments I've heard all make it seem about the best thing going.
No guarantees, but their speed in 3D is such that a K5000 probably wouldn't be too much of a slug in games. Excellent in Maya also. So, in summary: if the GTX 780 system is a great bargain, the GTX- which will not be useful in Solidworks, though probably quite good in Inventor- might be flogged to fund a Quadro or FirePro, and for Solidworks I would vote for a Quadro K5000 or 5000 using the Solidworks partnered driver. Cheers, BambiBoom.
Man, that helps me a ton mate.! Thanks for taking the time to write.! I'm not that much of a serious gamer, though I have a high quality game in Diablo 3 that I play around with on occasion. My main concern is that I have a decent video card that plays nice with Solidworks first - then any games I run will also benefit by having a good video card.?
There are a handful of other games I'm thinking of getting so that is why I'm concerned over the long haul.? I guess when it comes to video cards, I have to experience the types of things you have to really know the differences. But I thank you again for the advice.!
Cheers mate.!
Just joined this forum and am just learning Solidworks- on version 2013. I was able to get a student version to use as part of an online course, on my home computer.
My home computer is an Alienware Aurora R4 - i7 3.6 GHz, 16 GB RAM, 500 GB solid state drive, running a GTX 680 2 GB graphics card. Reading through the posts here, it looks like not the best choice, but it's what I have.
My question is whether I'll notice/have trouble running solidworks at a very low essentials/learning end. Swgmusr, As far as CPU, RAM >You don't mention the OS you're using, but it's important to check as Solidworks is only recently getting Windows 8 and 8.1 compliance. An important component to verify will be the graphics card. You may wish to load and test your version of Solidworks before taking any action, but I suspect there may be some problems with running some viewport features on the GTX 680.
These viewports produce orthographic projections of 3D model parts and assemblies and, because they're also involved in the drawing documentation process, they are important. Anti-aliasing may be limited also, and/or there may be problems in assembly animation. I'm not sure, as I haven't used 2013, and both my systems have Quadros (FX 4800 and 4000), which can run the special Solidworks drivers.
On my Dell Precision T5400, I use the Solidworks partnered driver and run Solidworks 2010 64-bit. Before buying a card, though, the first thing would be to discuss it with your instructor, and secondly, give it a try. Possibly the GTX 680 will not restrict learning the program up to the level necessary- difficult to say- but you might nose around for an inexpensive Quadro or FirePro. Actually, something even three generations old, like a Quadro FX 3800, will work quite well. Solidworks is a very good choice to learn >one of the best applications of any kind that I know and an industry standard. As a designer, I am still not a sophisticated user myself, but would like to be, as I have a couple of complex projects in the design phase- many pieces. If approached logically, I think Solidworks is easier to learn than Inventor- and has a higher capability than Rhino for very complex projects that also need complete documentation.
Cheers, BambiBoom. BambiBoom: Thanks for the response. My operating system is Win 7 Ultimate 64-bit. Looking over some threads on the Solidworks forums, it looks like it's hit and miss, sometimes driver dependent.
Just going to load and try it and see if it works. There was some discussion that the 6xx series on helps out. Swgmusr swgmusr, Yes, as mentioned, it's possible that the GTX 680 may get you far enough at the study level, and giving Solidworks a test drive with the 680 first will be the best way to avoid unnecessary expenditure. Solidworks is surprisingly intuitive- I got started quickly from a few YouTube videos and learned enough to accomplish some work, and also to understand the astounding range of capabilities and that I may never master it! I have lately thought of trying Rhino, which is easy to use, has good rendering capabilities and is also about $5,000 less expensive.
If you are planning to maintain an industrial design capability after you study, you might consider Rhino instead of Solidworks, as $1,000 is much easier to manage than $6-7,000 plus a support subscription of several thousand per year, especially if it's not your main work. Of course, there's always Catia (also made by Dassault)- $16-30,000 plus $6,000 per year. I am gradually switching from architecture to industrial design, and so wanted to learn an industry standard, but if you will be doing occasional projects on your own or with only one or two others, consider a trial with Rhino. Rhino may be a better long-term choice >when the student license expires, the bill to move to a full Solidworks commercial license can really sting! The Solidworks forums are packed with true-believer, extreme users who can answer more specifically than I can, but best of all will be to discuss this with the instructor, especially if there is quirky behavior.
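Purely to illustrate that cost argument, here is a rough five-year tally using the figures above. The yearly subscription amounts are assumed placeholders (the post only says 'several thousand per year' for Solidworks), not quoted prices.

```python
# Back-of-the-envelope 5-year cost of ownership. License prices are the rough
# figures mentioned above; the yearly subscription numbers are assumptions
# for illustration only, not vendor quotes.

def five_year_cost(license_price: float, yearly_subscription: float, years: int = 5) -> float:
    return license_price + yearly_subscription * years

options = {
    "Rhino":      five_year_cost(1_000, 0),       # assuming no mandatory subscription
    "Solidworks": five_year_cost(6_500, 2_000),   # 'several thousand per year' -> assume $2,000
    "Catia":      five_year_cost(20_000, 6_000),  # mid-range of the $16-30k figure
}
for name, cost in options.items():
    print(f"{name:<11} ~${cost:,.0f} over 5 years")
```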
I did not have good luck with my GTX (285) flirtation in CAD- and especially Solidworks and Sketchup- but CAD applications are changing their video approach >all over the map, and some Autodesk programs have gone OpenGL or DirectX; for example, Inventor 2013 would really fly on a GTX 680. Cheers, BambiBoom. @BambiBoom - I must say that your knowledge of computers and CAD stuff is very impressive.! I am learning a ton just reading through your threads here and I appreciate the time you take to give us wannabes more detailed information. Cheers mate.!
I am currently in the process of building my dream system for mainly CAD use. I own a seat of Solidworks for my business and I need a system that is stable and performance based. I recently learned that Solidworks will only support Windows 7 and up from 2014 on out. Does this variable change any of the opinions you've shared with us so far.? In other words, will only being able to use Windows 7 or 8 for Solidworks in the near future be another driving issue based on graphic card integrity.? Or does this not matter at all.?
The reason I ask is that if I build a new rig, I'm hoping it will last at least 3-5 years (and beyond if that's possible.?), so I want to make sure I get the right OS installed as well.? Thanks again for all your insights; it has been very helpful.!
Tubedog10x, Thank you for the encouraging words- I wish I were an expert, but I often feel with complex applications like Solidworks that the more I learn the less I believe I know. Having Solidworks for your work is an accomplishment in itself and both a fantastic programme and a commitment to very high quality project process. CAD has been a useful discipline for me as I am in the main a scribbler / sketcher / designer, and CAD enforces accuracy. As to the subject of OS compatibility, Solidworks has worked on Windows 7 a good while- I've used 2010 64-bit on 7 Ultimate since then and it's been Windows 8 compatible since V 2013-2014 and I think 8.1 for 2014-15. I'll be giving Windows 8 a miss entirely- I just don't see the point and as I have 75 desktop icons, I could never fit all those big icon/panels anyway- I'd have to use the Win7 lookalike version. The graphics card suggestions I think should be consistent for Windows 7 and 8 use, but as noted, the future 5 or 6 years on is not as clear as CAD applications are leaning inconsistently in both the OpenGL and CUDA directions.
The choices have never been more complicated, and I think it's difficult to find a graphics card that can do everything well. As performance improves, the optimization required means the cards are more specialised, and choices have to be made with care, working backwards from the applications. AutoCad 3D and Inventor will do well on a GTX, but Maya will run better on a two-generations-old 1GB Quadro 2000 than on a 6GB GTX Titan. Solidworks, however, is consistent and the hierarchy remains >Quadro 6000, K5000, Firepro W9000, Q 5000, W8000, W7000, K4000, 4000 and so on.
Personally, I would go beyond the Quadro 4000 for Solidworks, except for an older version such as the one I use (2010)- or, from the previous generation, the FX 5800 (4GB and 512-bit). In my new HP z420 I have a Quadro 4000 and it is at about the right level for my use of 2010, but if I were to name the ideal all-rounder to have today for all CAD, rendering, video, and graphic design- one that will probably even run games reasonably- it is the Quadro K5000, with the previous-generation Quadro 5000 and the K4000 not far behind. Something that I can only attribute to personal and anecdotal experience- not the numbers- is that cards with a wider bandwidth- 256-bit and above- always seem to handle the bigger programmes and giant files better. The highest bandwidth was the Quadro FX 5800 (4GB), made for video editing, and that was 512-bit. The Quadro 4000 and K5000 are 256-bit while the K4000 is 192-bit, and it seems that the 5000, which is 320-bit, does very well. In Solidworks benchmarks, the 2.5GB previous-generation 5000 can outperform the current 3GB K4000.
Is the 320-bit instead of 192-bit bus the reason? In my old system, the Dell Precision T5400, I use an FX 4800 at 384-bit- also a great card. I can't explain it except to say that it just feels as though the pipeline is more open. I note that the new 12GB Quadro K6000 ($5,000) is 384-bit, as were the Quadro 6000 and the FX 5800 and 4800. Best of luck with your new system. I'd be interested to know the final specification and also the other programmes you're using.
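For what it's worth, the bus-width point can be put into rough numbers. This is only a sketch: the 5.0 GT/s effective data rate is an assumed round figure applied to every bus width so they can be compared like for like, not the memory clock of any particular card.

```python
# Peak theoretical memory bandwidth = (bus width in bytes) x (effective data rate).
# The 5.0 GT/s rate is an assumed, illustrative figure; real cards also differ
# in memory clock, so treat these as relative numbers only.

def bandwidth_gb_s(bus_width_bits: int, data_rate_gt_s: float) -> float:
    """Theoretical peak bandwidth in GB/s."""
    return (bus_width_bits / 8) * data_rate_gt_s

for bits in (192, 256, 320, 384):
    print(f"{bits}-bit bus @ 5.0 GT/s: ~{bandwidth_gb_s(bits, 5.0):.0f} GB/s")
```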
Cheers, BambiBoom. There is something seriously rotten/wrong/odd about the workstation card market. For instance, the supposedly high-powered $1,000 Quadro 4000 has some ridiculously underwhelming specs (a 625 MHz core for starters - and this is a normal Fermi core, one generation OLDER than the Kepler core which the GTX 680 runs at 1100 MHz!). All nVidia's cards use the same chips, but as far as I can tell nVidia has *severely* crippled the consumer-line GTX cards in order to sell Quadro cards, but the Quadros are _also_ crippled in order to sell Tesla cards. NVidia's wettest dream, embodied in the Maximus drivers, is that you buy a Quadro to get decent display speed, and then a Tesla to get decent computational speed.
Even with this crippling, it seems nVidia is extremely unwilling to let Quadros face off against GeForce cards. I can see no other reason that there isn't _a single benchmark on the entire net_ which compares Quadros to GeForces to Radeons to FirePros on computational speed (and I'm not talking 3DMark score or framerate in Crysis 3 here, but single- and double-precision number crunching through CUDA and OpenCL). My impression is that when you buy a Quadro you pay for the driver and the firmware- the Quadro cards are clearly very weak, but still probably manage to outperform the *far* more capable GeForce hardware because of that hardware's intentionally crippled firmware and drivers. This seems to be what all the 'Quadros suck at games' and 'get a Quadro for graphics work' talk is really about: the Quadros are slow as heck, but their maths capabilities are less crippled than the faster GeForce's. NVidia is abusing the market because it has no serious competition.
That is probably the most informative and concise post I have found about the comparison between GeForce and Quadro.
MSI, well known for their gaming notebooks, also makes mobile workstations. They do not have the market share of Dell or HP, but their machines have extra features like the SteelSeries backlit programmable keyboard, RAID 0, 1, and 5, Cooler Boost cooling technology, and so on, and I think they are cheaper. You will also find that MSI will often give you a higher-end Quadro GPU than HP, Dell, or Lenovo at about the same or lower price. Back to your question.
The MSI GT60 2OKWS-278US is a 15.6' machine with a 3K IPS display giving you 2880 x 1620, with a Quadro K3100M. This might be something you want to look at. Link: GT60 2OKWS-278US
•Windows 7 Professional
•Intel® Core™ i7-4700MQ Processor
•15.6' WQHD+ 3K Display (16:9; 2880 x 1620)
•NVIDIA® Quadro® K3100M (4GB DDR3 VRAM)
•Matrix Display (4K support on all external displays, up to 3)
•Cooler Boost 2
•Full-Color Programmable Backlit Keyboard by SteelSeries
•Killer™ Doubleshot (Killer E2200™ Networking + Killer™ Wireless-N 1202)
•128GB SSD + 1TB HDD (7200RPM)
•16GB DDR3L 1600MHz System Memory
•USB 3.0 x 3; USB 2.0 x 1
•HDMI, mDP x2
•Blu-ray Disc Burner
•Built-in 720p HD Webcam
•World-Class Dynaudio Premium Speakers
•Audio Boost.
Hello again, my friend. A long time has passed since our last posts.
From PORTUGAL?!?! I hope so. :-) Now I have a serious issue to report.
I finally bought a laptop for work. I made a mistake in choosing the GTX 780M.
It really struggles in the 3D viewport and in 3D apps. The machine is a CLEVO P150SM (sold as SAGER in the USA). In SKETCHUP I simply cannot get a smooth 3D rotate, especially if I turn on shadows and shaders. The Kepler chip in the GeForce GTX is useless for this. I cannot get my work done. I have tested it, and my old 8700M GT performs about the same as this GTX 780M (a very high-end gamer card). I feel stupid for not choosing the K3100M. I need a smooth viewport, and the GTX 780M will not be able to deliver it.
Yes, the GTX 780M is very, very fast, but only in DirectX; in OpenGL it is poor. In 3D apps like SKETCHUP or RHINO it is impossible to have shaders on and all layers on too. For people who have decided to work in 3D, OpenGL and so on, go for a QUADRO. Even a K2000M is better than this GTX 780M; it was the one poor choice in my whole hardware configuration, made only because I had read some posts saying bad things about QUADRO drivers. What they mean is that if you do not have the correct driver for a QUADRO in a specific piece of software, you will not see any performance benefit. I need to sell my GTX 780M to get a K3100M.
My laptop configuration is: i7-4710MQ 2.5-3.5GHz; 15.6' Full HD, 95% high colour gamut; GTX 780M 4GB GDDR5 (a lot of CUDA cores, for nothing); 16GB G.Skill RAM, 1600MHz CL9; Samsung 840 PRO 256GB SSD. This configuration, and especially this graphics card, can't handle my 2MB Sketchup file with a million edges. Does anyone have something to say that could help me here? I really have to finish my job, and the GTX 780M simply can't handle it. Stupid card! I am furious with myself for buying this card.
But I need to find out: if someone has a QUADRO working in RHINO 5 or SKETCHUP, please tell me how the real-time viewport workflow is. Just to continue my last post.
I work mostly in PHOTOSHOP CS5, AUTOCAD 2010 (I recently installed AutoCAD 2012 to test the GTX 780M), SKETCHUP 2013, RHINO 5, and ARTLANTIS RENDER. Sometimes I use ILLUSTRATOR and INDESIGN, but mostly the first ones. So, what can I do to get a really nice, smooth viewport workflow in my 3D files? I need to have it, and I can't afford a K5000M or K6000M. Does the K3100M perform better in viewport workflow in those apps?
I am within the 15-day testing period for my new laptop, and I have asked the shop whether it is possible to swap my GTX 780M for the K3100M. Now I am waiting for the shop's answer. I had chosen the GTX 780M thinking it would be more versatile. AUTOCAD 2010 and 2012 rock on that card, because they are DirectX.
Photoshop CS5 is another app that performs excellently, because of its CUDA engine. ARTLANTIS works fine, because it relies mostly on RAM and CPU.
But SKETCHUP and RHINO are OpenGL, and the GTX 780M just can't handle them. Can anyone tell me whether changing my GTX 780M for the QUADRO K3100M would meet my needs? I have read claims that 'Quadro drivers are too old', and that if you don't have the correct driver they simply don't work as you expect. Is this true?
I have to decide, so any help will be gratefully received. PS: sorry for my bad and unpracticed English; I am from PORTUGAL.
Best regards. :-) Architex_art, Good to hear from you again. Choosing a graphics card is now more difficult than ever.
Actually, I intend to rewrite the opening post of this thread to reflect the new situation. The programs are more complex and have more features, the files are larger, and different programs rely on different technologies- DirectX, OpenGL / OpenCL- for good performance.
I have made an effort to find a simple way to switch between a Quadro and a GTX so that I can have good performance in all programs. However, this appears to be difficult- I would have to change the primary card in BIOS for every switch, and there may be conflicts having, for example, both Quadro and GTX drivers on the same OS, meaning a dual-boot configuration. I don't have personal experience with mobile professional graphics cards. I did, however, check Autodesk's recommended hardware and also Passmark Performance Test baselines, where I often see results for mobile GPUs. Looking at the recommendations for Autodesk products, the Quadro 3000M and 3100M are consistently recommended and certified.
On Passmark, a search for systems using an i7-4700MQ and Quadro 3100M showed several systems with good 2D and 3D scores: Precision M6800 >837 / 2591; Precision M6800 >819 / 1457; HP zBook 17 >785 / 2355; HP zBook 17 >851 / 1454. For reference, the 2D / 3D scores for my Xeon E5-1620 (3.6 / 3.8GHz), 24GB ECC 1600, Quadro 4000 (2GB) system were 839 / 2048, so the 4GB Quadro 3100M is doing quite well. I am surprised, though, that one Precision M6800 would score 2591 in 3D and another only 1457; likewise, one HP zBook scores 2355 and the other 1454. That difference is very large for two systems with the same CPU and graphics card. A 3D score of 2591 is within the range of scores for a desktop Quadro K4000 or 5000 (not K5000), but 1454 is something like a desktop Quadro 2000. The setup may need special attention.
Perhaps the two low 3D scores are because of power-saving settings. So, overall, it appears that the Quadro 3100M can be very good. I am not sure about the comment on Quadro drivers being old. The drivers do have a basis / foundation that is old, so that old Quadros can continue to use them- I run an older Quadro FX 580 and a much newer Quadro 4000 using the same driver- but they are often updated, and there are specialized 'partnered' drivers for programs such as Solidworks.
One thing that bothers me a bit about the Quadro software is that the management and updating run a lot of processes all the time. Also, NVIDIA will load all kinds of 3D (stereo) software and other controls that I don't use, because the applications have their own settings. I went into msconfig and turned off about six or seven items that were running in the background all the time. After doing this, in Passmark Performance Test, the 2D score changed from 767 to 839 and the 3D from 2044 to 2048. Also, I would mention that the Windows Aero theme is a terrible waste of GPU power. With my previous Quadro FX 4800, running an Aero theme with transparency reduced the overall graphics scores by almost 25%.
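For anyone who wants to see what is auto-starting before opening msconfig, here is a small read-only Python sketch. It assumes a standard Windows install and only lists the usual Run-key entries; it does not disable anything- the actual trimming described above was done in msconfig.

```python
# List Windows startup (Run key) entries, similar to what msconfig shows.
# Read-only: this only prints entries, it does not change or disable anything.
import winreg

RUN_KEYS = [
    (winreg.HKEY_LOCAL_MACHINE, r"SOFTWARE\Microsoft\Windows\CurrentVersion\Run"),
    (winreg.HKEY_CURRENT_USER,  r"SOFTWARE\Microsoft\Windows\CurrentVersion\Run"),
]

for hive, path in RUN_KEYS:
    try:
        with winreg.OpenKey(hive, path) as key:
            print(f"\n[{path}]")
            index = 0
            while True:
                try:
                    name, value, _ = winreg.EnumValue(key, index)
                except OSError:      # no more values under this key
                    break
                print(f"  {name}: {value}")
                index += 1
    except OSError:
        pass  # key may not exist in this hive
```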
Overall, I think the Quadro 3100M can be quite good but I suggest careful setup for best performance. A word about Sketchup.
The more I use Sketchup, the more I believe it is a quite limited program. I have a project with an 80MB file that is complex- many 3D trees. Even with a 3.8GHz Xeon CPU, 24GB 1600 RAM and a Quadro 4000 (Passmark system rating = 3923), on the 80MB file I have to turn off every layer and large component except the one or two I am changing and work in monochrome, and still the model is almost impossible to work with, as the latency in navigation and making changes- especially moving large objects- means things have to be done again and again. Sketchup locks up and has to be restarted about three times per hour. At this point, I can't really imagine a system that runs Sketchup very quickly and reliably. If anyone knows one- please write! I can never hope to create an animation.
There are great things about Sketchup, but I will not be using it again except for limited-size projects. If you do buy the Quadro 3100M, I would enjoy knowing if you like it. Boa sorte, meu amigo (good luck, my friend), BambiBoom.
I am not sure about the performance in Rhino 5, Sketchup, etc., but so far I have not heard any complaints from the hundreds of workstations we have sold. Below is my recommendation for an MSI mobile workstation with a 3K (2880 x 1620) display. GT60 2OKWS 3K-615US, retail $2,699.00 in the US:
•Intel i7-4800MQ
•16GB DDR3L (upgradable to 32GB)
•128GB mSATA SSD + 1TB 2.5' 7200rpm (upgradable to 3 x mSATA SSD + 1 x 2.5', RAID 0, 1, 5)
•Quadro K3100M GPU with 4GB
•15.6' 3K IPS display, 2880 x 1620 resolution
•Blu-ray writer (standard in all MSI workstations)
•Killer Networks
•1 x HDMI, 2 x mini DisplayPort (supports 3 external displays up to 4K resolution)
•USB 3.0 x 3, USB 2.0 x 1
•SteelSeries gaming backlit keyboard
•Dynaudio 2 speakers + subwoofer
•9-cell battery
•Windows 7 Pro
•2-year warranty + 1-year accidental damage.
GTX vs GT vs QUADRO - graphics card benchmark scores, HOLOMARK 2 in RHINOCEROS 5 SR8. Configurations tested:
•GeForce 8700M GT 512MB GDDR3 + Intel Core 2 Duo T7700 2.4GHz
•GeForce GTX 780M 4GB GDDR5 + Intel i7-4710MQ 2.5-3.5GHz
•Nvidia QUADRO K3100M + Intel i7-4930MX 3.00GHz
I think this proves how Nvidia has turned the GTX into an out-and-out gaming card. Only for games.
We users no longer have the opportunity to buy a versatile card, like in the good old days. The way I see it, we only have two options: choose a 'gamer' or a 'worker'. GTX and Kepler are garbage for 3D OpenGL work.
A waste of money. The difference between a workstation card and a gaming card is very simple (and complex at the same time). Similarities: 1. Workstation cards use the exact same hardware as gaming cards- they are the same silicon, capable of just the same raw performance. Even so, in order to make more money, both Nvidia and ATI charge an insane amount for workstation cards (despite the identical hardware).
Differences: 1. Unlike AMD, Nvidia deliberately cripples the double-precision floating-point performance on its gaming cards. Nvidia's workstation cards have faster double-precision performance only because the gaming cards are crippled, not because it is different hardware. 2. Both AMD and Nvidia include specialized drivers for workstation cards. These include things like more precise OpenGL rendering (perhaps by utilizing the double-precision floating point that was crippled on gaming cards?). 3. A number of high-profile software companies have partnered with Nvidia and AMD to gouge the customer by optimizing their software specifically (and ONLY) for the workstation cards' drivers.
This means that in some programs the workstation cards will perform significantly better than a gaming card- but only because these companies have worked together to cheat you.
Alternatively, if you find a program that is not from a big-name company, there is a good chance it may actually perform significantly better on a gaming card (especially one from AMD, where double-precision floating point is not crippled), because you can often afford a much faster gaming card for much less money than a workstation card. [Of course, it all depends on the double-precision floating point.] Long story short: if you have to use one of the programs specifically designed to only perform well on workstation cards, then you have no choice but to get a workstation card. The only way to know for sure is to find real-world benchmarks comparing workstation and gaming cards for the particular program you want to use.
Unfortunately, this kind of information is difficult to find. September notes- an update for anyone who might read this. There seems to be another difference as well:
* Workstation cards may use ECC (error-correcting) memory, while gaming cards may not.
* The lower double-precision floating-point performance on gaming cards seems to be the main (if not only) other key difference.
Here are some actual benchmarks comparing the performance of a workstation and a gaming card in different programs. Conclusion? Compare performance:
* Check the gaming vs. workstation benchmarks for your program.
* If you can't find a benchmark, try to find out whether the program relies heavily on double-precision floating point or was made to only work well on a workstation card.
When to get a workstation card for image-quality or precision reasons:
* Using the video card to render video or 3D scenes, where the use of floating point may affect the final quality. Since you are saving the rendered results, this can be an important reason.
* Viewing things in 3D, where minor accuracy issues in the model (due to double-precision floating point) may matter; in general, this really doesn't matter most of the time.
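To make the double-precision point concrete, here is a tiny sketch of the arithmetic. The FP32 throughput figures and the 1/24 and 1/3 FP64 ratios below are assumed, round illustrative values- not the published specs of any particular card- but they show how a nominally slower workstation part can end up far ahead in double precision.

```python
# Effective FP64 throughput = FP32 throughput x the FP64:FP32 ratio the vendor
# enables. All numbers below are assumed, illustrative values, not real specs.

def fp64_gflops(fp32_gflops: float, fp64_ratio: float) -> float:
    return fp32_gflops * fp64_ratio

cards = {
    "gaming card, FP64 at 1/24 rate":     fp64_gflops(4000, 1 / 24),
    "workstation card, FP64 at 1/3 rate": fp64_gflops(3500, 1 / 3),
}
for name, gflops in cards.items():
    print(f"{name}: ~{gflops:.0f} GFLOPS double precision")
```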