How to disable integrated graphics?

Learn how you can disable and remove the integrated graphics card with ease. See the downsides of disabling the iGPU (integrated Graphics Processing Unit) and how to work around them. Learn how to carefully and safely switch from an Nvidia to an AMD graphics card, and how to avoid your display going completely black. If you get stuck anywhere, watch the video on our YouTube channel.

Is it safe to disable integrated graphics?

The short answer is no: if the integrated GPU is your only graphics hardware, disabling it can cut off the display entirely or leave the screen completely black. At best, your CPU will handle only the most basic graphics through software rendering. On a laptop you will simply get worse graphics, since the CPU has to do the GPU’s work: expect lower resolution, slower animations, and higher battery consumption.

On a PC without a dedicated graphics card, there is nothing to fall back on, since the GPU is responsible for drawing the screen and everything related to visuals. If you remember old computers with command lines and no visuals, that is roughly what happens without a graphics adapter. In layman’s terms, no graphics card (integrated or dedicated) means no display. We strongly suggest that you do not disable the integrated graphics card unless a dedicated graphics card is set up in its place. In most cases there is no need to disable it at all, since you can set your dedicated graphics card as the default.

How do I permanently disable Intel HD graphics?

It is possible to permanently disable the integrated Intel HD Graphics on your system. However, you need to have a dedicated graphics card all set up before you disable it, because if you disable your only graphics adapter you will have no display, which means a blank screen. That is the last thing you want, so set up a dedicated graphics card first.

The dedicated graphics card will simply take over from the disabled integrated one, and Windows will adjust itself automatically. Follow these steps to disable the Intel HD graphics:

  1. Boot up (start) your system and right-click on “This PC” or “My Computer”; a menu should open. Click on “Manage” and a new window should open.
  2. In the new window, click on “Device Manager”, which should be located on the left side of the window.
  3. Now locate “Display adapters” in the list (the monitor icon). Expand it and select the adapter named Intel HD Graphics.
  4. Once you find it, right-click on the name of the card and then click on “Disable device”. The screen may go blank for 2-5 seconds while Windows adjusts the graphics settings; it should return to normal automatically.
  5. (Optional step) Open “Add or remove programs” by searching for it in the Windows search bar at the bottom. In the window, look for your old integrated graphics (Intel) software and uninstall it. This will prevent any conflicts from occurring in the future when you use a dedicated graphics card.
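If you prefer the command line, the Device Manager steps above can also be scripted. The sketch below, in Python, builds the equivalent PowerShell pipeline using the Get-PnpDevice and Disable-PnpDevice cmdlets (available on modern Windows; an elevated Administrator prompt is required). By default it only prints the command rather than running it, so treat it as a sketch, not a turnkey tool.

```python
import subprocess

# Sketch: build (and optionally run) a PowerShell one-liner that disables a
# display adapter by name, mirroring the Device Manager steps above.
# Assumes the PnpDevice cmdlets (Get-PnpDevice / Disable-PnpDevice) and an
# elevated PowerShell on Windows.
def build_disable_command(adapter_name: str) -> str:
    """Return the PowerShell pipeline that disables the named display adapter."""
    return (
        "Get-PnpDevice -Class Display "
        f"| Where-Object {{ $_.FriendlyName -like '*{adapter_name}*' }} "
        "| Disable-PnpDevice -Confirm:$false"
    )

def disable_adapter(adapter_name: str, dry_run: bool = True) -> str:
    cmd = build_disable_command(adapter_name)
    if not dry_run:
        # Windows only: actually disable the device (needs Administrator rights).
        subprocess.run(["powershell", "-NoProfile", "-Command", cmd], check=True)
    return cmd

# Dry run: just print the command that would be executed.
print(disable_adapter("Intel(R) HD Graphics"))
```

Keeping the dry-run default means you can inspect the exact command before letting it touch your hardware.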

How do I switch from NVIDIA to AMD GPU?

This is pretty simple as long as you follow the steps carefully; make sure you have the AMD dedicated graphics card with you. The first thing to do is remove all the Nvidia graphics drivers and related software from your system. You can download Guru3D’s Display Driver Uninstaller (DDU) to remove the drivers safely and effortlessly.

Follow the instructions given in the DDU app and disconnect your internet to avoid any complications. Windows will try to download the driver again if the internet stays connected, which is why disconnecting is important. Now follow these steps to switch from Nvidia to AMD:

  1. Download the AMD drivers from AMD’s website, but do not install them yet.
  2. Disconnect the internet and reboot your system into safe mode. Once in safe mode, run DDU (Guru3D’s Display Driver Uninstaller) and remove the Nvidia drivers.
  3. After removing the Nvidia drivers, reboot your system to complete the removal.
  4. Now shut down your system and swap the physical cards: remove the Nvidia card and install the AMD card. Make sure all the cables are connected properly before powering on.
  5. Boot your system and install the AMD driver; after installing, reboot your system.
  6. After completing the installation, reconnect your system to the internet and enjoy the new AMD graphics card.
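The safe-mode reboot in step 2 can also be scripted with bcdedit, Windows’ boot configuration tool, instead of using msconfig. The Python sketch below only assembles the two commands (enter and leave safe mode); the `{current}` token is bcdedit’s own syntax for the current boot entry, not a placeholder to fill in. Running them requires an elevated command prompt on Windows.

```python
# Sketch: toggle Windows safe-mode boot with bcdedit (run as Administrator).
#   bcdedit /set {current} safeboot minimal   -> next boot is minimal safe mode
#   bcdedit /deletevalue {current} safeboot   -> return to normal boot
def safe_mode_command(enable: bool) -> str:
    """Return the bcdedit command that turns safe-mode boot on or off."""
    if enable:
        return "bcdedit /set {current} safeboot minimal"
    return "bcdedit /deletevalue {current} safeboot"

print(safe_mode_command(True))   # command to enter safe mode on next boot
print(safe_mode_command(False))  # command to restore normal boot
```

Remember to run the second command after DDU finishes, or the system will keep booting into safe mode.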

How to disable integrated graphics?

The integrated graphics card should not be disabled if you do not have a dedicated graphics card installed, as that can leave you with no display at all. Driver conflicts between the two graphics cards can also cause graphics failures, which is why the uninstall steps below matter. We can disable the integrated graphics card in 2 different ways, both described below. The first way is via the BIOS (Basic Input-Output System), the firmware responsible for booting up your system. The second way is via Windows itself, using Device Manager and Add/Remove Programs.

NOTE: Chances are your BIOS will not offer an option to disable integrated graphics, since available BIOS settings depend on the motherboard. Not all motherboards support advanced features such as tampering with graphics cards or overclocking.

Method 1 (Using BIOS)

This method may not work for everyone, since not all BIOS versions support this functionality. You may be able to see the option, but it could be greyed out by default. Search online for your motherboard’s brand and model to see if there is a workaround. If you cannot find the option, move on to the second method.

Follow the steps to disable the integrated graphic card via BIOS:

  1. Power on (or restart) your system and press the BIOS key, which is usually shown at the bottom of the boot screen. The key varies between motherboards; common choices are F10 and Delete, but it can also be F1, F2, or F12. (If you miss the key, reboot and try again.)
  2. Once the BIOS menu opens (it often looks dated and oddly colored, so you will recognize it without any issue), look for the “Integrated Peripherals” section and, under it, a setting named “Integrated Graphics”, “Onboard Graphics”, or “VGA”. Select it and a new window/menu should open.
  3. In the menu, change the integrated graphics setting from “Enabled” to “Disabled” (or “Off”). If clicking does not work, check how your BIOS changes settings; it may be a cycle menu operated with the arrow and Enter keys.
  4. Once it changes, choose “Save”, which should be at the bottom, and then exit the BIOS. You should find all the buttons, such as Exit and Save, at the bottom of the screen. If the system asks for confirmation, confirm with the Y key or by clicking Yes.

Method 2 (Using Device Manager)

This will work for all readers as long as you are using Windows and not a UNIX-like OS. Follow these steps to disable the integrated graphics card:

  1. Boot up (start) your system and right-click on “This PC” or “My Computer”; a menu should open. Click on “Manage” and a new window should open.
  2. In the new window, click on “Device Manager”, which should be located on the left side of the window.
  3. Now locate “Display adapters” in the list (the monitor icon). Expand it and select the integrated graphics adapter (named Intel or AMD); make sure you select the integrated card and not the dedicated one. If you are not sure, look up your system online to check the name of its integrated graphics.
  4. Once you find it, right-click on the name of the card and then click on “Disable device”. If you see a confirmation or warning message, click “Yes” or “OK”, then verify the change: the card’s context menu should now show “Enable device”.
  5. Now open “Add or remove programs” by searching for it in the Windows search bar at the bottom. In the window, look for your old integrated graphics (Intel or AMD) software and uninstall it. This will prevent any conflicts from occurring in the future when you use a dedicated graphics card.
    NOTE: Make sure you do not delete any important software; search for the software’s name on Google before removing it.
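The driver cleanup in step 5 can also be done with pnputil, Windows’ built-in driver-store tool: `pnputil /enum-drivers` lists the installed driver packages, and `pnputil /delete-driver oemNN.inf /uninstall` removes one (both need an elevated prompt). As a sketch, the Python helper below parses the enum output to find the package names for a given vendor; the sample text in it mimics pnputil’s output format and is illustrative only.

```python
# Sketch: scan `pnputil /enum-drivers` output for driver packages published
# by a given vendor (e.g. "Intel"), so they can then be removed with
#   pnputil /delete-driver <oemNN.inf> /uninstall
def find_driver_infs(enum_output: str, vendor: str) -> list:
    """Return the Published Names (oemNN.inf) whose Provider matches vendor."""
    infs = []
    current_inf = None
    for line in enum_output.splitlines():
        line = line.strip()
        if line.startswith("Published Name:"):
            # Remember the package name until we see who provides it.
            current_inf = line.split(":", 1)[1].strip()
        elif line.startswith("Provider Name:"):
            if current_inf and vendor.lower() in line.lower():
                infs.append(current_inf)
            current_inf = None
    return infs
```

Double-check each oemNN.inf against the Original Name and Class fields in the full pnputil listing before deleting anything, so you do not remove an unrelated driver from the same vendor.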

Does disabling integrated graphics improve performance?

Yes, disabling the integrated graphics card in favor of a dedicated one can noticeably improve performance wherever system performance matters. HOWEVER, you need the integrated graphics card to use your system normally unless you have a dedicated graphics card. The advantages of disabling the integrated graphics are:

  1. It will immediately make your system faster by using the dedicated graphics card instead of the integrated one, taking the load off your CPU.
  2. The fan over your CPU will slow down since the CPU is used a lot less, which makes your system quieter.
  3. Since integrated graphics use system RAM instead of dedicated VRAM, disabling them takes the load off your RAM. You will be able to handle more tasks than before and play games more smoothly.

How do I switch from integrated graphics to graphics cards?

The process is really simple; follow the steps mentioned below:

  1. Power off your system and plug in the graphics card, make sure all the cables are connected properly.
  2. Boot up your system, download and install the drivers for your graphics card. The screen will go black from time to time while installing, which means that the driver is adjusting.
  3. Once installed, power off your system and connect the monitor cable to the graphics card’s output instead of the motherboard’s. The card has its own HDMI/DisplayPort ports.
  4. Now power on your system and check that everything works properly. Go to Device Manager -> Display adapters -> check for the driver name.
  5. You can choose which graphics card an app uses from that app’s settings. For example, many games ask in their display settings which graphics card to use. Simply switch from the iGPU (integrated) to the dedicated GPU.
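To confirm which adapters Windows currently sees (step 4 above), you can also query WMI from PowerShell: `Get-CimInstance Win32_VideoController | Select-Object Name, DriverVersion | ConvertTo-Csv -NoTypeInformation`. The Python sketch below parses that CSV output into a list of adapters; the sample row used in testing is illustrative, not real hardware.

```python
import csv
import io

# Sketch: parse the CSV produced by the PowerShell query
#   Get-CimInstance Win32_VideoController |
#     Select-Object Name, DriverVersion | ConvertTo-Csv -NoTypeInformation
# so a script can verify that the new dedicated card shows up.
def parse_adapters(csv_output: str) -> list:
    """Return one dict per video adapter, keyed by 'Name' and 'DriverVersion'."""
    return list(csv.DictReader(io.StringIO(csv_output)))
```

If the new card is missing from the parsed list, reseat it and reinstall the driver before going further.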

Summary

In this article, we saw the importance of a dedicated graphics card, how to disable integrated graphics, and why you should not disable it without a dedicated graphics card set up first. If you are not sure about integrated vs. dedicated graphics cards, know that you need a dedicated GPU for intense graphical work. This includes gaming, video editing, and making any kind of computer-generated imagery such as animation.

We would strongly discourage removing the driver; disabling it is enough to make your system faster or to make a dedicated GPU the primary GPU. If you still remove the driver, basicknowledgehub is not responsible for any damage, since all the dangers are mentioned in the article above. Go through the steps carefully to avoid any driver problems with your system, and watch videos of installing a physical graphics card to avoid damaging it.
