Sunday, 25 October 2015

Running with multiple graphics cards for compute and gaming

I have a small but ostensibly conflicting set of requirements for my desktop computer:
  1. General use - keep power consumption low
  2. 3D modelling / graphics - exploit the GPUs as much as practical, for rendering in particular
  3. Video / gaming / live graphics - use my best GPU
The CPU is an i7 with an Intel HD 4000 integrated GPU, so it's more than adequate for general messing about. I have a GeForce 660 Ti and was recently donated a GeForce 670 as well. Happily my PSU can support both cards in use.

I also know from previous messing about with Blender that using a GPU for compute (rendering) while it is also driving the screen is not a good plan, so the quest was on to find a way of using the Intel graphics to drive the screen most of the time, while keeping the option of using the GeForce 670 for modest gaming, and the ability to run the display from the Intel graphics while Blender uses both GeForce cards for compute when rendering.
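
As an aside, the compute device selection can be scripted as well as set through Blender's user preferences. Here's a minimal sketch assuming a recent Blender build with the Cycles add-on enabled - the property paths have moved between Blender versions, so treat the exact names as illustrative:

    import bpy

    # The Cycles add-on preferences hold the compute device settings
    # (in older builds these lived under user preferences > system).
    prefs = bpy.context.preferences.addons['cycles'].preferences
    prefs.compute_device_type = 'CUDA'

    # Refresh the device list and enable every CUDA device -
    # both GeForce cards in this setup.
    prefs.get_devices()
    for device in prefs.devices:
        device.use = True

    # Tell the current scene to render on the GPU rather than the CPU.
    bpy.context.scene.cycles.device = 'GPU'

Run from Blender's Python console (or via blender --background --python script.py); with both cards enabled, Cycles splits the render tiles across them.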

The current working method to achieve this is not perfect, but from my perspective quite usable.

Motherboard / BIOS settings

The motherboard is set to allow both integrated and discrete graphics to be active; without this setting, either the Intel GPU or the GeForce GPUs will not be available to the OS. Integrated (Intel) graphics is also set as the default graphics option. This means that the BIOS and the GRUB bootloader (I dual boot Linux / Windows) display through the motherboard graphics port.


PC / Monitor connections

My monitor (Dell U2713H) has multiple digital inputs - in particular 2 DisplayPort inputs and an HDMI input.

I use DisplayPort to connect the motherboard to the monitor (I cannot run the monitor at its native 2560x1440 resolution from the integrated graphics over HDMI; it's fine over DisplayPort).

I connect the GeForce 670 to the monitor using HDMI.


Windows Settings

I have Windows set to extend the desktop, with the DisplayPort (HD 4000) connection as the primary display and HDMI (GeForce 670) as the second display, positioned on the right.

Working method

The monitor is set by default to use the DisplayPort input. This input is also set to use wide-gamut colour and is colour calibrated.

To run games or other software that needs the GeForce to drive the display, I set the game to run on the second screen (this typically only needs to be done once), and then immediately after launching the game I switch the monitor input to HDMI.

Disadvantages

Windoze thinks that the second display (in normal mode) or the first display (in gaming mode) is really there and allows the mouse to roam onto that screen, where the pointer simply disappears from view. Really just a minor irritation once you are aware of what is happening.

The monitor also has to be switched manually between the two inputs - not really a problem, just a mild inconvenience.

Power

Adding the second GPU increases the idle load only slightly - by around 10 watts or less. Windows now idles at around 85 watts mains input.

Gaming (which uses the 670) increases this to around 180 watts (although a heavy game would probably use more).

Full-on compute - such as ray-trace rendering in Blender - takes power to around 320 to 330 watts.

Monday, 27 July 2015

Timelapse from a Raspberry Pi webcam

Just a few notes on taking very large numbers of photos with a view to making timelapse videos.

I've done a few timelapse videos using my DSLR, then processed them through Lightroom and LRTimeLapse. This works well, but it is quite hard work, ties up my DSLR, and of course uses up a lot of shutter actuations (since I can't use an electronic shutter with the DSLR).

So I thought I'd have a go at making it happen with a webcam and a Raspberry Pi (I already had the webcam from other PC things I'd been doing, or else I might have used a Pi camera).

I'm using a Logitech C920 webcam and planning to take full HD (1920x1080) photos.

  1. Kit in use & setup
  2. Taking the photos - fswebcam (see the sketch after this list)
  3. And the answer is Python and v4l2?
  4. First tests to check performance.
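
To give a flavour of the fswebcam approach before the detail, here's a minimal sketch of a capture loop driven from Python - the device path, resolution, interval and frame count are all assumptions to adjust for the actual setup:

    #!/usr/bin/env python
    # Hypothetical timelapse loop: shell out to fswebcam once per interval
    # and write numbered full HD JPEGs.
    import subprocess
    import time

    DEVICE = "/dev/video0"   # where the C920 usually appears
    INTERVAL = 10            # seconds between frames
    FRAMES = 360             # 360 frames = 15 seconds of video at 24 fps

    for i in range(FRAMES):
        started = time.time()
        subprocess.call([
            "fswebcam",
            "-d", DEVICE,          # capture device
            "-r", "1920x1080",     # full HD stills
            "--no-banner",         # suppress the timestamp banner
            "frame_%05d.jpg" % i,
        ])
        # sleep for whatever is left of this interval
        remaining = INTERVAL - (time.time() - started)
        if remaining > 0:
            time.sleep(remaining)

The numbered frames can then be stitched into a video with something like ffmpeg -framerate 24 -i frame_%05d.jpg -c:v libx264 timelapse.mp4.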