Tech Posts

RX 5600 XT vs. GTX 1660 Ti vs. GTX 1080: The Ultimate 2025 Used GPU Showdown

July 15, 2025 | By das

AMD is well known for its wide range of graphics cards and accessories, and the Radeon RX 5600 XT remains one of its more interesting options on the used market.

Note: If you buy something from our links, we might earn a commission. See our disclosure statement.

The card promises high-end features and competes directly with two NVIDIA counterparts: the GTX 1660 Ti and the GTX 1080. Let's see how it stacks up against them.

A Generational Crossroads: RX 5600 XT vs. GTX 1660 Ti vs. GTX 1080

A Comparative Analysis of the AMD RX 5600 XT, NVIDIA GTX 1660 Ti, and NVIDIA GTX 1080 in the Modern Used Market

This analysis provides a definitive comparison of three pivotal graphics cards now competing in the budget-conscious second-hand market. The central conflict is a classic trade-off: do the modern efficiencies of the Radeon RX 5600 XT and GeForce GTX 1660 Ti outweigh the legacy power and larger VRAM of the venerable GeForce GTX 1080? With these cards converging on a similar price point, the choice hinges on a nuanced understanding of raw performance, modern features, VRAM capacity, power efficiency, and productivity workloads. This report systematically dissects each factor to provide a clear, evidence-based verdict.

A Tale of Three Architectures

GeForce GTX 1080 (Pascal, 16nm): A "trickle-down flagship" whose formidable power comes from its original premium design. It embodies a "brute force" philosophy with a high core count and a wide 256-bit memory bus.

GeForce GTX 1660 Ti (Turing, 12nm): A "purpose-built mid-ranger" focused on efficiency. It incorporates modern shader improvements but omits RT/Tensor cores to hit a specific price and power target (120W TDP).
Radeon RX 5600 XT (RDNA 1.0, 7nm): A "binned-down high-end die" on the most advanced process of the three. It uses the same powerful silicon as the RX 5700 but is constrained by a narrower memory interface.

Comparative GPU Specifications

| Feature | GeForce GTX 1080 | GeForce GTX 1660 Ti | Radeon RX 5600 XT (Updated) |
|---|---|---|---|
| Cores/SPs | 2560 CUDA Cores | 1536 CUDA Cores | 2304 Stream Processors |
| VRAM | 8 GB GDDR5X | 6 GB GDDR6 | 6 GB GDDR6 |
| Memory Bus | 256-bit | 192-bit | 192-bit |
| Bandwidth | 320 GB/s | 288 GB/s | 336 GB/s |
| TDP | 180 W | 120 W | 150 W |

Rasterization Performance: The Gaming Gauntlet

[Charts: relative average FPS at 1080p and 1440p, High/Ultra settings. Data synthesized from multiple sources.]

The VRAM Dilemma: The Most Critical Factor for Longevity

The 6GB Bottleneck: While adequate at launch, 6GB of VRAM is now a significant liability. Modern AAA games, developed for consoles with large memory pools, can easily exceed this capacity even at 1080p. This leads to severe stuttering and forces users to lower texture quality, diminishing the visual experience. The GTX 1080's 8GB frame buffer provides crucial headroom, making it a paradoxically more "future-proof" and resilient card in the current market.

6 GB (High Risk) RX 5600 XT / GTX 1660 Ti: Insufficient for high textures in many new titles.
8 GB GTX 1080: The effective minimum for a smooth, high-fidelity 1080p experience.

Productivity & Content Creation

Streaming & Video: The GTX 1660 Ti wins with its modern Turing NVENC encoder, offering superior quality and efficiency for OBS and Premiere.

3D Rendering (Legacy): The GTX 1080 holds an edge due to its higher CUDA core count (2560), which benefits legacy non-ray-traced renderers.

Power, Thermals & Efficiency

GTX 1660 Ti: 120W | RX 5600 XT: 150W | GTX 1080: 180W

The GTX 1660 Ti is the efficiency king, making it ideal for SFF builds. The GTX 1080 requires a more robust PSU and case cooling.
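As a sanity check on the bandwidth column in the table above, GDDR bandwidth follows directly from bus width and effective data rate. A minimal Python sketch, assuming the cards' stock effective memory rates (10 Gbps GDDR5X, 12 Gbps GDDR6, and 14 Gbps GDDR6 respectively):

```python
def bandwidth_gbps(effective_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak theoretical memory bandwidth in GB/s.

    bandwidth = effective data rate (Gbit/s per pin) * bus width (pins) / 8 bits per byte
    """
    return effective_rate_gbps * bus_width_bits / 8

# Figures match the specifications table above
print(bandwidth_gbps(10, 256))  # GTX 1080:    10 Gbps on 256-bit -> 320.0 GB/s
print(bandwidth_gbps(12, 192))  # GTX 1660 Ti: 12 Gbps on 192-bit -> 288.0 GB/s
print(bandwidth_gbps(14, 192))  # RX 5600 XT:  14 Gbps on 192-bit -> 336.0 GB/s
```

Note how the RX 5600 XT's faster 14 Gbps memory more than compensates for its narrower 192-bit bus against the GTX 1660 Ti, yet still trails the GTX 1080's raw 256-bit width by only a small margin.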
Advanced Topics & Community Questions

GeForce GTX 1080: Taming the Pascal Legend in 2025

🔍 How do I fix the DisplayPort 1.4 flicker with recent 2025 drivers on Pascal?

This is a known issue, and the fix is a firmware update. Some users with high-refresh-rate monitors using DisplayPort 1.3 or 1.4 may experience screen flickering or a blank screen at boot with modern drivers. This is due to the card's original VBIOS not being fully compatible with the latest DisplayPort standards. NVIDIA released a dedicated firmware updater to fix this.

1. Download the Tool: Search for "NVIDIA Graphics Firmware Update Tool for DisplayPort 1.3 and 1.4" and download it directly from NVIDIA's support website.
2. Run as Administrator: Close all other applications, right-click the downloaded .exe file, and select "Run as administrator."
3. Follow Prompts: The tool will automatically detect whether your GTX 1080 needs the update. Follow the on-screen instructions to flash the new firmware; the process usually takes less than a minute.
4. Reboot: Once complete, restart your computer. The flickering should be resolved. This is a one-time update that permanently fixes the incompatibility.

🔍 How can I optimize my GTX 1080 for VRChat on a Valve Index to prioritize frametime over FPS?

For VR, smooth frametimes are everything. In CPU-heavy and poorly optimized VRChat worlds, maintaining a stable frametime to avoid stutter and reprojection matters more than raw FPS. The goal is to lock your frame rate to half the headset's refresh rate (e.g., 45 FPS for a 90Hz Index) and deliver each frame consistently.

Target a Stable Undervolt: Use MSI Afterburner's curve editor (Ctrl+F). Instead of pushing for maximum clocks, find a moderate, stable clock speed at a low voltage. A common sweet spot for Pascal is ~1860MHz at 900mV. This reduces power spikes and heat, which are major causes of inconsistent frametimes.
Set a Framerate Cap: In the NVIDIA Control Panel, go to "Manage 3D Settings" > "Program Settings," select VRChat, and set "Max Frame Rate" to just below half your headset's refresh rate (e.g., 44 FPS for a 90Hz display). This prevents the GPU from rendering unnecessary frames and creating unstable loads.

Adjust SteamVR Settings: In SteamVR, set the per-application resolution for VRChat to a fixed 100%, and disable advanced supersampling and motion smoothing initially. This gives you a consistent baseline to work from.

In-Game VRChat Settings: Lower the "Avatar Display" settings significantly, limiting the number of avatars shown and their performance details. Avatars are often the biggest cause of CPU and GPU bottlenecks.

🔍 Is it possible to undervolt a GTX 1080 for sub-10W idle power draw in Windows 11?

Yes, but it requires a specific driver setting. By default, even at idle, the GTX 1080 can draw 15-25W. The key to achieving ultra-low idle power is forcing the card into its lowest power state (P8) when not in use. This has become more crucial with high-refresh-rate monitors, which often keep the card in a higher power state (P2).

1. Open NVIDIA Control Panel: Go to "Manage 3D Settings."
2. Set Power Management Mode: In the "Global Settings" tab, find "Power management mode" and change it from "Normal" to "Prefer maximum performance." This seems counter-intuitive, but proceed to the next step.
3. Apply and Change Back: Click "Apply," then change the setting back to "Normal" and click "Apply" again. This cycle often resets the driver's power-state logic.
4. (Crucial Step) Use NVIDIA Profile Inspector: Download this third-party tool. Under the "Global Profile," find section "5 - Common" and set "CUDA - Force P2 State" to Off. This is the most important step to prevent the card from getting stuck in a mid-power state.
5. Verify: Use a tool like GPU-Z to monitor power draw. At idle on the desktop, "Board Power Draw" should now drop to between 7W and 10W.
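The half-rate cap advice above comes down to simple frametime arithmetic. A small Python sketch (the 90Hz Index figure is from the text; `half_rate_cap` encodes the "just below half refresh" rule as refresh/2 minus 1):

```python
def frametime_budget_ms(rate_hz: float) -> float:
    """Time available to deliver one frame at a given rate."""
    return 1000.0 / rate_hz

def half_rate_cap(refresh_hz: int) -> int:
    """Cap just below half the headset refresh rate, per the VRChat advice above."""
    return refresh_hz // 2 - 1

# Valve Index at 90 Hz: ~11.1 ms per native frame,
# ~22.2 ms per frame at the 45 FPS half-rate target.
print(round(frametime_budget_ms(90), 1))  # -> 11.1
print(round(frametime_budget_ms(45), 1))  # -> 22.2
print(half_rate_cap(90))                  # set "Max Frame Rate" to -> 44
```

The point of the cap is that the GPU only has to deliver a frame every ~22.7 ms instead of every ~11.1 ms, so occasional slow frames no longer cause reprojection.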
🔍 DLDSR + DLSS on GTX 1080 – what's the image quality sweet spot?

Myth-Busting: You CANNOT use DLSS on a GTX 1080. DLSS (Deep Learning Super Sampling) requires dedicated Tensor Core hardware, which the Pascal architecture lacks. Any guide claiming to enable DLSS on a GTX card is incorrect; it is likely confusing DLSS with DSR or FSR. The same caveat applies to DLDSR (Deep Learning Dynamic Super Resolution): it also relies on Tensor Cores and is only exposed on RTX cards.

What you can use is classic DSR (Dynamic Super Resolution). DSR renders the game at a higher resolution (e.g., 2.25x your native 1080p pixel count) and then scales it down to your monitor's resolution, producing a sharper, better anti-aliased image.

The Trade-Off: DSR is purely an image quality enhancement bought with performance; it is the opposite of an upscaler like FSR. On a GTX 1080, DSR is only viable in older or less demanding games where your FPS is already very high (e.g., 120+ FPS in titles like CS:GO or League of Legends). For modern AAA games, the performance cost will make the game unplayable. It is not a practical feature for demanding titles on this hardware.

🔍 What are the real gains from hack-enabling Resizable BAR on a GTX 1080?

Verdict: It's a myth; the gains are zero to negligible. Like the GTX 1660 Ti, the GTX 1080 and its Pascal architecture do not officially support Resizable BAR. While community tools can force the feature "on" in the driver, extensive testing by multiple outlets and users has confirmed that it provides no meaningful performance uplift. In a best-case scenario, benchmarks show a 1-2% improvement, which is well within the margin of error for testing. In some cases, it can even cause slight performance regressions or instability. The Pascal architecture lacks the necessary optimizations in its memory controller and VBIOS to take advantage of the full PCIe address space that ReBAR enables.
While it's a fascinating technical experiment, it is not a viable method for extracting more performance from a GTX 1080. Your time is better spent on traditional overclocking and undervolting.

GeForce GTX 1660 Ti: Unlocking Hidden Potential

🔍 How can I get a 40 FPS "Steam Deck-like" experience in Starfield on a GTX 1660 Ti?

Starfield is notoriously demanding. To achieve a stable 40 FPS on a GTX 1660 Ti at 1080p, you need aggressive settings combined with custom .ini file tweaks. First, use the "Low" in-game preset and ensure FSR 2 is set to 75% resolution scale. Then, create a `StarfieldCustom.ini` file in `Documents\My Games\Starfield` and paste the following consolidated tweaks:

```ini
[Display]
fShadowDistance=10000.0000
fDirShadowDistance=10000.0000

[Grass]
bAllowCreateGrass=0
fGrassStartFadeDistance=1000.0000
fGrassMaxStartFadeDistance=1000.0000
iMinGrassSize=0

[Decals]
uMaxDecals=100

[Water]
bUseWater=0

[General]
bVolumetricLightingEnable=0
```

These settings drastically reduce shadow distance, disable intensive grass and water rendering, and turn off volumetric lighting, targeting the biggest performance hogs to stabilize your frame rate.

🔍 Is using DSR on a GTX 1660 Ti worth it compared to native 1080p?

Generally, no. DSR (Dynamic Super Resolution) renders the game at a higher resolution (e.g., 1440p) and downscales it to your monitor's native 1080p, acting as a high-quality anti-aliasing method. You can enable it in the NVIDIA Control Panel under "Manage 3D Settings" > "DSR - Factors." However, this comes at a massive performance cost. The GTX 1660 Ti already struggles to maintain 60 FPS in many modern titles at native 1080p, and DSR will drop your frame rate significantly, making most games unplayable. It is only a viable option in older, less demanding titles where you have extreme performance headroom (e.g., 150+ FPS) and want to trade those excess frames for better image quality. For modern gaming, it's not a practical feature for this card.
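The performance cost of DSR discussed above scales directly with pixel count: DSR factors multiply the total number of pixels rendered, so each axis scales by the square root of the factor. A minimal Python sketch using two of NVIDIA's standard DSR factors at 1080p:

```python
NATIVE = (1920, 1080)  # 1080p monitor

def dsr_resolution(native: tuple[int, int], factor: float) -> tuple[int, int]:
    """Internal render resolution for a DSR factor (factor = pixel-count multiplier)."""
    scale = factor ** 0.5  # per-axis scale is sqrt of the pixel multiplier
    return (round(native[0] * scale), round(native[1] * scale))

for factor in (2.25, 4.00):
    w, h = dsr_resolution(NATIVE, factor)
    ratio = (w * h) / (NATIVE[0] * NATIVE[1])
    print(f"{factor}x DSR -> {w}x{h} ({ratio:.2f}x the pixels to render)")
# 2.25x DSR -> 2880x1620, 4.00x DSR -> 3840x2160 (full 4K workload)
```

Rendering 2.25x to 4x the pixels is why a card that barely holds 60 FPS at native 1080p cannot afford DSR in modern titles.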
🔍 How do I fix the "RmInitAdapter failed" error on Linux with a GTX 1660 Ti?

This is a known issue with NVIDIA drivers on Linux kernels 6.4 and newer, causing the GPU to fail initialization at boot. The fix is to add a kernel parameter so that GPU power management is handled by the driver.

1. Open a terminal and edit your GRUB configuration file with a text editor:
   sudo nano /etc/default/grub
2. Find the line that starts with GRUB_CMDLINE_LINUX_DEFAULT and add nvidia.NVreg_PreserveVideoMemoryAllocations=1 inside the quotes. It should look something like this:
   GRUB_CMDLINE_LINUX_DEFAULT="quiet splash nvidia.NVreg_PreserveVideoMemoryAllocations=1"
3. Save the file (Ctrl+O, Enter) and exit (Ctrl+X).
4. Update GRUB to apply the changes:
   sudo update-grub
5. Reboot your system. The error should be resolved.

🔍 Can I get real performance gains by forcing Resizable BAR on a GTX 1660 Ti?

No, the gains are not real or meaningful. The GTX 1660 Ti's Turing architecture and VBIOS do not officially support Resizable BAR. While it's technically possible to use tools like NVIDIA Profile Inspector to find and enable the three hidden flags for ReBAR, community testing has shown this provides no tangible benefit on unsupported hardware. Any performance change is typically within the margin of error and can even lead to instability. Unlike modern RTX cards that are designed for it, the 1660 Ti lacks the foundational support to leverage this feature. It's an interesting experiment for tinkerers but not a practical performance enhancement.

🔍 What's the difference between undervolting a 100W laptop vs. a 120W desktop 1660 Ti?

The goal of undervolting differs for each platform, leading to different target curves in MSI Afterburner.

Desktop (120W) Goal: Maximize efficiency and reduce heat to maintain higher, more stable boost clocks. You have more thermal headroom, so focus on finding the lowest voltage for your desired high clock speed.
Sample Target: Press Ctrl+F in Afterburner to open the curve editor. Find the point for 925mV, drag it up to ~1920MHz, hit 'L' to lock the voltage, and then 'Apply'. This often provides near-stock performance while cutting power draw by 15-20W.

Laptop (100W or less) Goal: Prevent thermal throttling and reduce fan noise for a more consistent, quieter experience. You are severely limited by thermals and power, so focus on finding a moderate clock speed at a much lower voltage.

Sample Target: Open the curve editor, find the point for 850mV, and drag it up to ~1785MHz. Lock the voltage. This will significantly reduce heat, stop the GPU from hitting its thermal limit, and provide a much more stable (if slightly lower peak) frame rate than the default fluctuating clocks.

Radeon RX 5600 XT: Answering the Community

🔍 Does the RX 5600 XT support Resizable BAR on B450/B550, and how do you enable it?

Yes, it does! AMD enabled Smart Access Memory (its branding for Resizable BAR) for RX 5000 series GPUs, and enabling it can provide a performance uplift of 5-10% in some games. Here's the step-by-step guide:

1. Update Motherboard BIOS: Ensure your B450 or B550 motherboard has the latest BIOS version with AGESA 1.1.0.0 or newer. This is a mandatory first step.
2. Enter BIOS/UEFI: Restart your PC and press the key to enter BIOS setup (usually Del, F2, or F12).
3. Enable Above 4G Decoding: Navigate to "Advanced" or "PCI Subsystem Settings," then find and enable the "Above 4G Decoding" option. This must be enabled first.
4. Enable Re-Size BAR Support: After enabling the above, a new option, "Re-Size BAR Support" or "Smart Access Memory," should appear. Set it to Auto or Enabled.
5. Disable CSM (if enabled): Resizable BAR requires a pure UEFI boot environment. In the "Boot" section of your BIOS, find the "Compatibility Support Module" (CSM) and ensure it is Disabled. Your OS must be installed in UEFI mode (standard for Windows 10/11).
6. Save and Exit: Save your BIOS changes and reboot.
Verify in AMD Software: Open AMD Adrenalin Software and go to Settings (gear icon) > Graphics. "Resizable BAR" should now show as Enabled; if not, re-check your BIOS settings.

🔍 Can the RX 5600 XT use FSR 3 Frame Generation to hit 90+ FPS?

Officially, no. In reality, sometimes. AMD's official driver-level frame generation (AMD Fluid Motion Frames, or AFMF) is limited to the RX 6000 series and newer; the RX 5600 XT does not support it. However, for games that implement FSR 3 natively (like *Forspoken* or *Immortals of Aveum*), the upscaling component works perfectly on the 5600 XT. The Frame Generation part is where it gets tricky.

The Community Mod Solution: Enthusiasts have created mods that replace a game's DLSS 3 files with FSR 3 components, effectively enabling Frame Generation on non-RTX cards. Using these mods in games like *Cyberpunk 2077* or *The Witcher 3*, an RX 5600 XT can see significant FPS boosts, potentially pushing a 50-60 FPS baseline up to 80-90 FPS. Be aware of the trade-offs, however: input latency will increase, and visual artifacts can occur. This is an unsupported, experimental solution best reserved for single-player games.

🔍 What are the best budget AM5 CPUs to pair with an RX 5600 XT in 2025?

While the RX 5600 XT is a great match for AM4 CPUs like the Ryzen 5 3600 or 5600, it's still a capable 1080p card if you're planning a platform jump to AM5. To avoid overspending on a CPU that the GPU will bottleneck, here are the smartest budget AM5 pairings for 2025:

Best Value: AMD Ryzen 5 7500F. This CPU (often available in specific markets or as a tray processor) offers nearly identical gaming performance to the Ryzen 5 7600 but lacks integrated graphics. Since you have a dedicated GPU, it's a perfect way to save money while getting modern Zen 4 performance.

Best Overall: AMD Ryzen 5 7600 / 7600X. The go-to choice for budget AM5 gaming builds.
Its strong single-core performance provides a rock-solid foundation for the RX 5600 XT and gives you massive headroom to drop in a much more powerful GPU later without changing your platform.

Pairing with these CPUs ensures the RX 5600 XT is always the performance bottleneck, which is ideal: you get the full power of your GPU while having a modern platform ready for your next big upgrade.

🔍 BIOS-mod vs. driver tuning: How do I get the most memory bandwidth?

This is a relic of the past; modern driver tuning is the only way. The "VBIOS modding" discussion for the RX 5600 XT refers to a one-time, permanent flash needed for early models to unlock their full potential (14 Gbps memory speed). If you buy a used 5600 XT today, make sure it's a model that already runs at 14 Gbps (like the Sapphire Pulse or Gigabyte Gaming OC); no further BIOS modding is required or recommended.

All modern performance tuning (overclocking the GPU core and memory) should be done through AMD Adrenalin Software under the "Performance" > "Tuning" tab. You can safely push the memory clock slightly beyond its stock 1750 MHz (14 Gbps effective) for a small bandwidth increase, but the gains are marginal. The era of needing a special VBIOS for baseline performance is over.

🔍 What are the best Fortnite performance presets for the RX 5600 XT in Chapter 5?

Fortnite's move to Unreal Engine 5 made it more demanding. Here are two optimized presets for the RX 5600 XT at 1080p for the current season:

| Setting | Competitive (Avg. 144-160 FPS) | Quality (Avg. 70-90 FPS) |
|---|---|---|
| Rendering Mode | Performance Mode | DirectX 12 |
| Anti-Aliasing | Off | TSR High |
| View Distance | Medium / Far | Far |
| Textures | Low | High |
| Meshes | Low | High |
| Nanite / Lumen | N/A (Disabled) | Off |

For competitive play, Performance Mode is non-negotiable. For a better-looking experience, the DX12 preset with TSR High provides excellent image quality while maintaining smooth gameplay.
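To see why the memory-overclock gains above are called marginal, the arithmetic is straightforward: GDDR6 transfers 8 bits per pin per clock, so 1750 MHz is the stock 14 Gbps effective rate. A Python sketch (the 1800 MHz overclock value is illustrative, not a measured stable clock):

```python
BUS_WIDTH_BITS = 192  # RX 5600 XT memory bus

def gddr6_bandwidth_gbs(mem_clock_mhz: float) -> float:
    """Peak bandwidth in GB/s: GDDR6 effective rate is 8x the memory clock."""
    effective_gbps = mem_clock_mhz * 8 / 1000  # 1750 MHz -> 14 Gbps
    return effective_gbps * BUS_WIDTH_BITS / 8

stock = gddr6_bandwidth_gbs(1750)  # 336.0 GB/s at stock 14 Gbps
oc = gddr6_bandwidth_gbs(1800)     # illustrative mild overclock
print(f"stock: {stock:.1f} GB/s, OC: {oc:.1f} GB/s (+{(oc / stock - 1) * 100:.1f}%)")
```

A 50 MHz bump buys under 3% more bandwidth, which rarely translates into a visible FPS gain.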
Final Recommendations

For the Pure 1080p Gamer (Maximum FPS and Longevity)
Winner: GeForce GTX 1080. Its 8GB VRAM buffer is the decisive factor, providing essential headroom for modern AAA games and ensuring a longer usable lifespan without compromising texture quality.

For the Budget Streamer or Video Editor
Winner: GeForce GTX 1660 Ti. Despite being the slowest in gaming, its modern Turing NVENC encoder offers superior stream quality, and its CUDA support provides reliable acceleration in professional video apps.

For the Power & Efficiency-Conscious / SFF Builder
Winner: GeForce GTX 1660 Ti. With a 120W TDP, it is by far the most efficient card, producing less heat and fitting into compact builds where the other two would struggle.

The Best All-Rounder (Overall Recommendation): GeForce GTX 1080
In the unique context of the used market, the GTX 1080 is the most compelling package. It combines top-tier 1080p performance with the critical 8GB VRAM buffer, and its higher power draw is a manageable trade-off for significantly better longevity and visual fidelity. The king may be old, but it has aged far more gracefully than its 6GB competitors.

Affiliate Disclosure: Faceofit.com is a participant in the Amazon Services LLC Associates Program. As an Amazon Associate we earn from qualifying purchases.