Sunday, 25 October 2015

Running with multiple graphics cards for compute and gaming

I have a small but seemingly conflicting set of requirements for my desktop computer:
  1. General use - keep the power low
  2. 3D modelling / graphics - exploit the GPU as much as practical, for rendering in particular
  3. Video / gaming / live graphics - use my best GPU
The CPU is an i7 with Intel HD 4000 integrated graphics, so it is more than adequate for general messing about. I have a GeForce 660 Ti and was recently donated a GeForce 670 as well. Happily, my PSU can support both cards in use.

I also know from previous messing about with Blender that using the same GPU for compute (rendering) and for driving the screen is not a good plan, so the quest was on to find a way of having the Intel graphics drive the screen most of the time, while keeping the option of using the GeForce 670 for modest gaming, and being able to run Blender from the Intel graphics with both GeForce cards used for compute while rendering.
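For the Blender side of this, the useful point is that Cycles can be told to use both GeForce cards for CUDA compute regardless of which GPU is driving the display. The device selection can also be scripted; here is a minimal sketch assuming a Blender 2.8+ style Python API (older releases spell the preferences path slightly differently, e.g. user_preferences), which enables every CUDA device for rendering:

    # Enable all CUDA devices for Cycles rendering (sketch, Blender 2.8+ API assumed)
    import bpy

    cycles_prefs = bpy.context.preferences.addons['cycles'].preferences
    cycles_prefs.compute_device_type = 'CUDA'   # use the nVidia cards rather than the CPU
    cycles_prefs.get_devices()                  # refresh the detected device list

    for device in cycles_prefs.devices:
        device.use = (device.type == 'CUDA')    # tick both GeForce cards, leave the CPU off

    # Point the current scene at the GPU devices selected above
    bpy.context.scene.cycles.device = 'GPU'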

The current working method to achieve this is not perfect, but from my perspective quite usable.

Motherboard / BIOS settings

The motherboard is set to allow both integrated and discrete graphics to be active. Without this setting, either the Intel GPU or the GeForce GPUs will not be available to the OS. The integrated (Intel) graphics is also set as the default graphics option, which means that the BIOS and the GRUB bootloader (I dual boot Linux / Windows) display through the motherboard graphics port.


PC / Monitor connections

My monitor (Dell U2713H) has multiple digital inputs - in particular 2 DisplayPort inputs and an HDMI input.

I use DisplayPort to connect the motherboard to the monitor (I cannot run my monitor at native resolution from the integrated graphics over HDMI; it is fine over DisplayPort).

I connect the GeForce 670 to the monitor using HDMI.


Windows Settings

I have Windows set to extend the desktop, with the DisplayPort (HD 4000) connection as the primary display and HDMI (GeForce 670) as the second display, positioned on the right.

Working method

The monitor is set by default to use the DisplayPort input. This default setup is also configured for wide gamut colour and is colour calibrated.

To run games or other software that needs the GeForce to drive the display, I set the game to run on the second screen (this typically only needs doing once), and then immediately after launching the game I switch the monitor input to HDMI.

Disadvantages

Windoze thinks that the second display (in normal mode) or the first display (in gaming mode) is really there and allows the mouse to roam onto that screen, where it completely disappears from view. Really just a minor irritation once you are aware of what is happening.

Also, the monitor has to be switched manually between the two modes; not really a problem, just a mild inconvenience.

Power

Adding the second GPU increases the idle load only slightly - by around 10 watts or a little less. Windows idling now runs at around 85 watts mains input.

Gaming (which uses the 670) increases this to around 180 watts (although a heavy game would probably use more).

Full-on compute, such as ray trace rendering in Blender, takes power up to around 320 to 330 watts.

Monday, 27 July 2015

Timelapse from a Raspberry Pi webcam

Just a few notes on taking very large numbers of photos with a view to making timelapse videos.

I've done a few timelapse videos using my DSLR, then processing them through Lightroom and LRTimelapse. This works well, but is quite hard work, ties up my DSLR and of course uses up a lot of shutter actuations (since I can't use an electronic shutter on the DSLR).

So I thought I'd have a go at making it happen with a webcam and a Raspberry Pi (I already had the webcam from other PC things I'd been doing, or else I might have used a Pi camera).

I'm using a Logitech C920 webcam and planning to take full HD sized photos.

  1. Kit in use & setup
  2. Taking the photos - fswebcam
  3. And the answer is python and v4l2?
  4. First tests to check performance.
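
As a taste of what the capture side looks like before going into the detail above, here is a minimal interval-capture sketch. It uses OpenCV rather than the v4l2 route mentioned in the outline, and the device index, interval and file names are just illustrative assumptions:

    # Grab a full HD frame from the webcam every few seconds (illustrative sketch)
    import time
    import cv2

    INTERVAL_SECONDS = 10            # time between frames
    FRAME_COUNT = 360                # 360 frames at 10 s intervals = 1 hour of capture
    OUTPUT_PATTERN = "frame_{:05d}.jpg"

    cap = cv2.VideoCapture(0)                    # the C920, assumed to be /dev/video0
    cap.set(cv2.CAP_PROP_FRAME_WIDTH, 1920)      # ask the camera for full HD
    cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 1080)

    for i in range(FRAME_COUNT):
        ok, frame = cap.read()                   # grab and decode one frame
        if ok:
            cv2.imwrite(OUTPUT_PATTERN.format(i), frame)
        time.sleep(INTERVAL_SECONDS)

    cap.release()

Stitching the resulting numbered jpegs into a video is then a separate step (ffmpeg, mencoder or similar).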

Sunday, 21 December 2014

Better display quality, printing and the quest for 10 bit colour support

Since I got a nice display (Dell U2713H) and got it properly set up, I have a good, reliable photography workflow on Windows, and with a little work I have the monitor working nicely on Ubuntu as well (this particular monitor requires a pixel clock fix to enable Ubuntu to drive the native resolution through the nVidia drivers - see below the break).

My hardware setup is:
  • Asus P8Z77-V LE motherboard
  • Intel i7-3770
  • nVidia GTX 660 Ti graphics card -> DisplayPort / HDMI
  • or sometimes Intel HD 4000 graphics -> DisplayPort
  • Dell U2713H monitor
I use DS Colour Labs for all my colour printing and can get a pretty well perfect match between the print preview in Lightroom and the resulting print viewed in a viewing box. However, even with Lightroom / Photoshop I still sometimes see banding while working on photos, and it is easy to show that this is often due to the 8 bits per colour channel per pixel that is normal for today's computers.

(To see this, just make a new image in an image editor and fill it with a gradient over a fairly restricted range of greys, such as from 55 to 65. I was surprised when I first tried this by just how obvious the effect is.)
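If you'd rather script the test than drag a gradient tool, here is a minimal sketch of the same demonstration using numpy and Pillow; the image size and file name are arbitrary choices:

    # Build a full-width gradient spanning only grey levels 55..65 and save it as 8 bit
    import numpy as np
    from PIL import Image

    width, height = 1920, 400
    row = np.linspace(55, 65, width)                 # one row running smoothly from 55 to 65
    gradient = np.tile(row, (height, 1)).astype(np.uint8)

    Image.fromarray(gradient, mode="L").save("banding_test.png")
    # Only 11 distinct 8 bit levels are available across 1920 pixels, so the steps
    # show up as clearly visible vertical bands on an 8 bit display pipeline.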

So I started looking around to see if I could get at the 10 bits / channel that my monitor is capable of.

At first it appears that the only way to do this is to buy a professional workstation graphics card, but these are dramatically more expensive than their consumer equivalents. Then I found that the nVidia Linux drivers are fully 10 bit capable (usually referred to as 30 bit colour) on all relatively recent cards!
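
As far as I can tell, turning this on under Linux is mainly a matter of asking X for a 30 bit default depth and restarting X; applications then still need their own 30 bit support to benefit. A sketch of the relevant xorg.conf fragment (the identifiers are just placeholders for whatever the existing config uses) looks like this:

    Section "Screen"
        Identifier     "Screen0"
        Device         "Device0"
        Monitor        "Monitor0"
        DefaultDepth    30
        SubSection     "Display"
            Depth       30
        EndSubSection
    EndSection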

Monday, 17 November 2014

Just a quick build for a client to play nice music

Now that I have minimserver working well and have found sensible playback software, here's a quick note on a basic build for a client machine to act as a UPnP renderer, with optional control, and Chrome installed for Netflix access.

Saturday, 15 November 2014

Maybe the time has finally come to do the media server thing

I've been keen on the idea of a media (primarily music) server for a long time, but the completely turd-like control facilities have always been a show stopper. Mainly because, with a fair sized collection of classical music, the inability to handle the distinction between composer, performer, orchestra, soloist etc. is a deal breaker - at least without spending a fortune on Linn or other proprietary kit.

At one time I seriously considered a fully diy solution, but that was too big a commitment.

My most recent foray was with Plex, but this still didn't solve the indexing problem.

Then yesterday I bumped into minimserver. This looked like the sort of answer I was after: it appears to handle the indexing and access properly.

What I want to do is:
  1. media on a headless server (probably a VM on my pet server)
  2. control agents, preferably on Android
  3. playback through Windows or Ubuntu PCs (or even Android devices)
Still early days, but things are looking promising so here is the log of my first steps.

Thursday, 23 October 2014

Shooting in raw pt 2 - Panasonic DMC-FZ1000

Continuing the story of raw vs jpeg, here are some photos from a Panasonic Lumix. The jpeg processing here is very different to the Canon's, and preserves more detail, especially in the well exposed areas. But the opportunity to recover very dark and very bright areas is still greatly reduced when starting from a jpeg.

Wednesday, 22 October 2014

Yes you really should shoot raw, and this is why.....

I've just been looking at some photos taken by a friend from camera club; he was trying to find out if taking photos in raw really makes a difference compared to using jpeg. In short, yes it does, and now that I've looked at some photos, it makes even more of a difference than I had realised: not just more chance to see into the highlights and dark areas, but more detail and better colour rendition as well.

While there may still be occasions when jpeg is a good idea (fast auto repeat without filling the buffer, or just taking an enormous number of pictures with a limited memory card size), use raw unless you have a good reason not to.