Using a DVI cable instead of the VGA cable solved the problem.
Ubuntu 16.04, GeForce 9600 GT, NVIDIA driver 304.131 (same issue with 340.96), VGA cable.
I cannot set the resolution to 1680x1050 with the NVIDIA driver (X Error of failed request: BadMatch (invalid parameter attributes)).
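In the xrandr output below, VGA-0 reports 0mm x 0mm, which usually means no EDID was read from the monitor over that cable (and would be consistent with the DVI cable fixing it). As a rough check, assuming the default Xorg log location and, for the second command, the read-edid package installed:

grep -i edid /var/log/Xorg.0.log
sudo get-edid | parse-edid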
xrandr:
Screen 0: minimum 8 x 8, current 1024 x 768, maximum 8192 x 8192
DVI-I-0 disconnected (normal left inverted right x axis y axis)
VGA-0 connected primary 1024x768+0+0 (normal left inverted right x axis y axis) 0mm x 0mm
1024x768 60.00*+
1360x768 59.96 59.80
1152x864 60.00
800x600 72.19 60.32 56.25
680x384 59.96 59.80
640x480 59.94
512x384 60.00
400x300 72.19
320x240 60.05
DVI-I-1 disconnected (normal left inverted right x axis y axis)
HDMI-0 disconnected (normal left inverted right x axis y axis)
cvt 1680 1050:
# 1680x1050 59.95 Hz (CVT 1.76MA) hsync: 65.29 kHz; pclk: 146.25 MHz
Modeline "1680x1050_60.00" 146.25 1680 1784 1960 2240 1050 1053 1059 1089 -hsync +vsync
xrandr --newmode:
sudo xrandr --newmode "1680x1050_60.00" 146.25 1680 1784 1960 2240 1050 1053 1059 1089 -hsync +vsync
sudo xrandr --addmode VGA-0 1680x1050_60.00:
X Error of failed request: BadMatch (invalid parameter attributes)
Major opcode of failed request: 140 (RANDR)
Minor opcode of failed request: 18 (RRAddOutputMode)
Serial number of failed request: 29
Current serial number in output stream: 30
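Since the proprietary NVIDIA driver rejects xrandr --addmode for modes it cannot validate (typically when no EDID is available, as the 0mm x 0mm size above suggests), a possible workaround is to declare the mode in /etc/X11/xorg.conf and relax mode validation. This is only a sketch under those assumptions; the Identifier names are illustrative and the file may need to be created or merged with existing nvidia-settings output:

Section "Monitor"
    Identifier "Monitor0"
    # Modeline taken from the cvt output above
    Modeline "1680x1050_60.00" 146.25 1680 1784 1960 2240 1050 1053 1059 1089 -hsync +vsync
EndSection

Section "Device"
    Identifier "Device0"
    Driver "nvidia"
    # Let the driver accept modes that are not listed in the monitor's EDID
    Option "ModeValidation" "AllowNonEdidModes"
EndSection

Section "Screen"
    Identifier "Screen0"
    Device "Device0"
    Monitor "Monitor0"
    SubSection "Display"
        Modes "1680x1050_60.00"
    EndSubSection
EndSection

After saving, restarting the display manager (for example sudo service lightdm restart on 16.04) should apply it; if the monitor cannot actually sync at 1680x1050 over VGA the screen may stay blank, so keep a backup of the previous xorg.conf.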