This thread is good feedback. I think changing the default to a 1TB SSD is
a reasonable request.
Please send any future comments regarding hardware to Sophana (
s...@mozilla.com) to increase the chances that feedback is acted on.
On Wed, Nov 8, 2017 at 9:09 AM, Julian Seward wrote:
On 08/11/17 17:28, Boris Zbarsky wrote:
> The last desktop I was shipped came with a 512 GB drive. [..]
>
> In practice, I routinely run out of disk space and have to delete
> objdirs and rebuild them the next day, because I have to build
> something else in a different srcdir...
I totally agree.
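For anyone else juggling objdirs, here is a quick way to see which ones are eating the disk. This is a minimal sketch; the ~/src layout and the obj-* naming are assumptions, so adjust to taste:

from pathlib import Path

def dir_size_gb(root: Path) -> float:
    # Sum the sizes of all regular files below `root`, in gigabytes.
    return sum(p.stat().st_size for p in root.rglob("*") if p.is_file()) / 1e9

# Assumed layout: one objdir named obj-* inside each srcdir under ~/src.
for objdir in sorted(Path.home().glob("src/*/obj-*")):
    print(f"{dir_size_gb(objdir):6.1f} GB  {objdir}")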
I’d like to add the VM multiplier: I’m working mainly on OSX and run a Windows
and a Linux VM in there with their own checkouts and objdirs. Instead of
allocating comfortably sized virtual disks, I end up keeping them as small as
possible to save host space, and resizing them quite frequently to avoid
running out of room in the guests.
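Growing an image in place at least beats re-creating it. A hypothetical helper, assuming qcow2 images managed with qemu-img (VMware Fusion and Parallels ship their own resize tools):

import subprocess

def grow_disk(image: str, extra_gb: int) -> None:
    # qemu-img only grows the virtual disk; the filesystem inside the
    # guest still has to be expanded afterwards (e.g. with resize2fs).
    subprocess.run(["qemu-img", "resize", image, f"+{extra_gb}G"], check=True)

grow_disk("linux-dev.qcow2", 20)  # hypothetical image name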
On 11/7/17 4:13 PM, Sophana "Soap" Aik wrote:
Nothing is worse than hearing that IT chose hardware that nobody
actually wanted or will use.
If I could interject with a comment about the hardware we pick...
The last desktop I was shipped came with a 512 GB drive. One of our
srcdirs is
Thanks Jeff, I understand your reasoning. 14 cores vs 10 is definitely
huge.
I will also add that there isn't anything stopping us from having more than
one config, just like we do with laptops.
I'm fortunate to be in a position to finally help you all have
influence on the type of hardware that
Kris has touched on the many advantages of having a standard model. From
what I am seeing of most people's use cases, the GPU is the only component
that determines what the machine is used for, e.g. the VR research team may
end up only needing a GPU upgrade.
Fortunately the new W-Series Xeons seem to
With all this talk…
I’m eagerly waiting for the iMac Pro.
Best of all worlds really:
- High core count
- ECC RAM
- 5K 27” display
- Great graphics card
- Super silent…
I’ve been using a Mac Pro 2013 (the trash can one), Xeon E5 8 cores, 32 GB ECC
RAM, connected to two 27” screens (one 5K with
On Wed, Nov 08, 2017 at 09:43:29AM +0200, Henri Sivonen wrote:
> I agree that workstation GPUs should be avoided. Even if they were as
> well supported by Linux distro-provided Open Source drivers as
> consumer GPUs, it's at the very least more difficult to find
> information about what's true
I agree that workstation GPUs should be avoided. Even if they were as
well supported by Linux distro-provided Open Source drivers as
consumer GPUs, it's at the very least more difficult to find
information about what's true about them.
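A quick way to confirm which driver a given card actually ended up on, assuming mesa-utils' glxinfo is installed:

import subprocess

# Print the OpenGL vendor/renderer lines so you can confirm the
# distro-provided open-source driver is in use rather than some fallback.
info = subprocess.run(["glxinfo"], capture_output=True, text=True).stdout
for line in info.splitlines():
    if line.lstrip().startswith(("OpenGL vendor", "OpenGL renderer")):
        print(line.strip())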
We don't need the GPU to be at max spec like we need the CPU
The Core i9s are quite a bit cheaper than the Xeon Ws:
https://ark.intel.com/products/series/125035/Intel-Xeon-Processor-W-Family vs
https://ark.intel.com/products/126695
I wouldn't want to trade ECC for 4 cores.
-Jeff
On Tue, Nov 7, 2017 at 3:51 PM, Sophana "Soap" Aik
If you don't want to get into the weeds on ECC again, please do not
reinitiate the discussion. I do not agree that "the additional cost of ECC
is very low compared to the cost of developer time over the two years
that they're expected to use it", but I will restrict my disagreement
to the forked
On Tue, Nov 07, 2017 at 03:07:55PM -0500, Jeff Muizelaar wrote:
On Mon, Nov 6, 2017 at 1:32 PM, Sophana "Soap" Aik wrote:
Hi All,
I'm in the middle of getting another evaluation machine with a 10-core
W-Series Xeon Processor (that is similar to the 7900X in terms of clock
Hi All,
I'm in the middle of getting another evaluation machine with a 10-core
W-Series Xeon Processor (that is similar to the 7900X in terms of clock
speed and performance) but with ECC memory support.
I'm trying to make sure this is a "one size fits all" machine as much as
possible.
Also
My understanding of current policy is that ECC is not required. (and
not even an option with MacBook Pros) Given the volume of development
that happens unhindered on our developers' many, many non-ECC
machines, I believe the burden of proof is on the pro-ECC
argument to show that it's
On 06/11/2017 22:44, Jeff Gilbert wrote:
> Price matters, since every dollar we spend chasing ECC would be a
> dollar we can't allocate towards perf improvements, hardware refresh
> rate, or simply more machines for any build clusters we may want.
And every day our developers or IT staff waste
Price matters, since every dollar we spend chasing ECC would be a
dollar we can't allocate towards perf improvements, hardware refresh
rate, or simply more machines for any build clusters we may want.
The paper linked above addresses massive compute clusters, which seems
to have limited
> On Nov 6, 2017, at 05:19, Gabriele Svelto wrote:
>
>> On 04/11/2017 01:10, Jeff Gilbert wrote:
>> Clock speed and core count matter much more than ECC. I wouldn't chase
>> ECC support for general dev machines.
>
> The Xeon-W SKUs I posted in the previous thread all had
On 04/11/2017 01:10, Jeff Gilbert wrote:
> Clock speed and core count matter much more than ECC. I wouldn't chase
> ECC support for general dev machines.
The Xeon-W SKUs I posted in the previous thread all had identical or
higher clock speeds than equivalent Core i9 SKUs and ECC support with
the
Thank you for including an AMD card among the ones to be tested.
The Radeon RX 460 mentioned earlier in this thread arrived. There was
again enough weirdness that I think it's worth sharing in case it
saves time for someone else:
Initially, for multiple rounds of booting with different
Clock speed and core count matter much more than ECC. I wouldn't chase
ECC support for general dev machines.
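As a back-of-envelope illustration of that tradeoff, treating a fully parallel build as cores times clock. This ignores turbo behavior, memory bandwidth, and Amdahl's law, and the clocks are the ones quoted in this thread, not benchmarks:

# Rough, illustrative comparison only: aggregate throughput ~ cores * clock.
machines = {
    "i7, 4 cores @ 4.0 GHz base": 4 * 4.0,
    "i9, 10 cores @ 2.8 GHz base": 10 * 2.8,
    "i9 top SKU, 18 cores @ 2.6 GHz base": 18 * 2.6,
}
for name, core_ghz in machines.items():
    print(f"{name}: {core_ghz:.1f} core-GHz")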
On Thu, Nov 2, 2017 at 6:46 PM, Gregory Szorc wrote:
> On Thu, Nov 2, 2017 at 3:43 PM, Nico Grunbaum wrote:
>
>> For rr I have an i7 desktop with
On Thu, Nov 2, 2017 at 3:43 PM, Nico Grunbaum wrote:
> For rr I have an i7 desktop with a base clock of 4.0 GHz, and for building
> I use icecc to distribute the load (or rather I will be again when bug
> 1412240[0] is closed). The i9 series has lower base clocks (2.8
For rr I have an i7 desktop with a base clock of 4.0 GHz, and for
building I use icecc to distribute the load (or rather I will be again
when bug 1412240[0] is closed). The i9 series has lower base clocks
(2.8 GHz, and 2.6 GHz for the top SKUs)[1], but high boost clocks of 4.2
GHz. If I were
Thanks Gabriele, that poses a problem then for the system build we have in
mind here as the i9s do not support ECC memory. That may have to be a
separate system with a Xeon.
On Fri, Oct 27, 2017 at 3:58 PM, Gabriele Svelto
wrote:
> On 27/10/2017 01:02, Gregory Szorc wrote:
Thank you Henri for the feedback.
How about this, we can order some graphics cards and put them in the
evaluation/test machine that is with Greg, to make sure it has good
compatibility.
We could do:
Nvidia GTX 1060 3GB
AMD Radeon RX 570
These two options will ensure it can drive multiple displays.
Hello everyone, great feedback that I will keep in mind and continue to
work with our vendors to find the best solution. One of the cards I was
looking at, which is fairly cheap and can at least drive multiple displays
(even 4K at 60 Hz), is the Nvidia Quadro P600. I feel, especially based on
the work
BTW can someone forward this entire thread to their friends at AMD so AMD
will fix their CPUs to run rr? They're tantalizingly close :-/.
Rob
On 28/10/2017 01:08, Sophana "Soap" Aik wrote:
> Thanks Gabriele, that poses a problem then for the system build we have
> in mind here as the i9's do not support ECC memory. That may have to be
> a separate system with a Xeon.
Xeon-W processors are identical to the i9 but come with more
Yeah. Only the Xeons and ThreadRipper (as our potential high core count
machines) support ECC. rr, ECC, or reasonable costs: pick at most two :/
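And if a machine does end up with ECC DIMMs, it's worth verifying the platform actually enables them, since ECC modules can silently run in non-ECC mode. A minimal Linux-only check against the kernel's EDAC sysfs:

from pathlib import Path

# EDAC registers one mc* entry per memory controller when ECC reporting
# is active; no entries means ECC is absent or disabled.
controllers = sorted(Path("/sys/devices/system/edac/mc").glob("mc[0-9]*"))
print("ECC reporting active:", bool(controllers))
for mc in controllers:
    # Correctable/uncorrectable error counts accumulated since boot.
    ce = (mc / "ce_count").read_text().strip()
    ue = (mc / "ue_count").read_text().strip()
    print(f"{mc.name}: {ce} corrected, {ue} uncorrected")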
On Fri, Oct 27, 2017 at 4:08 PM, Sophana "Soap" Aik
wrote:
> Thanks Gabriele, that poses a problem then for the system build we have
On 27/10/2017 01:02, Gregory Szorc wrote:
> Sophana (CCd) is working on a new system build right now. It will be based
> on the i9's instead of dual socket Xeons and should be faster and cheaper.
... and lacking ECC memory. Please, whatever CPU is chosen, make sure it
has ECC support and the
Not necessarily relevant to this specific discussion, but I'm on a
Lenovo P50 running Linux, and wanted to offer up my setup as a
datapoint. (It's not quite a recommendation or a word of warning;
a combination of the two.)
I use Linux (Fedora 25) as the host OS, with two external monitors plus
On Fri, Oct 27, 2017 at 2:34 AM, Henri Sivonen wrote:
> And the downsides don't even end there. rr didn't work. Plus other
> stuff not worth mentioning here.
>
Turns out that rr not working with Nvidia on Ubuntu 17.10 was actually an
rr issue triggered by the Ubuntu libc
On Fri, Oct 27, 2017 at 4:48 AM, Sophana "Soap" Aik wrote:
> Hello everyone, great feedback that I will keep in mind and continue to work
> with our vendors to find the best solution with. One of the cards that I was
> looking at is fairly cheap and can at least drive
On Thu, Oct 26, 2017 at 4:31 PM, Mike Hommey wrote:
> On Thu, Oct 26, 2017 at 04:02:20PM -0700, Gregory Szorc wrote:
> > Also, the machines come with Windows by default. That's by design: that's
> > where the bulk of Firefox users are. We will develop better products if
> the
On Thu, Oct 26, 2017 at 04:02:20PM -0700, Gregory Szorc wrote:
> Also, the machines come with Windows by default. That's by design: that's
> where the bulk of Firefox users are. We will develop better products if the
> machines we use every day resemble what actual users use. I would encourage
>
On Thu, Oct 26, 2017 at 7:02 PM, Gregory Szorc wrote:
> I also share your desire to not issue fancy video cards in these machines
> by default. If there are suggestions for a default video card, now is the
> time to make noise :)
Intel GPUs are the best choice if you want to be
On Thu, Oct 26, 2017 at 7:02 PM, Gregory Szorc wrote:
> Unless you have requirements that prohibit using a
> VM, I encourage using this setup.
rr doesn't work in Hyper-V. AFAIK the only Windows VM it works in is VMware.
-Jeff
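The underlying issue is that rr needs precise hardware performance counters, which most hypervisors don't virtualize. A rough way to check whether counters are usable in a given guest, assuming Linux perf is installed (if perf can't count a plain hardware event, rr won't work either):

import subprocess

result = subprocess.run(
    ["perf", "stat", "-x", ",", "-e", "instructions:u", "true"],
    capture_output=True, text=True,
)
# perf prints "<not supported>" / "<not counted>" when the PMU is unavailable.
usable = result.returncode == 0 and "<not " not in result.stderr
print("hardware counters usable:", usable)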
On Thu, Oct 26, 2017 at 6:34 AM, Henri Sivonen wrote:
> On Thu, Oct 26, 2017 at 9:15 AM, Henri Sivonen
> wrote:
> > There's a huge downside, though:
> > If the screen stops consuming the DisplayPort data stream, the
> > graphical session gets killed!
On 10/26/2017 06:34 AM, Henri Sivonen wrote:
> As for the computer at hand, I want to put an end to this Nvidia
> obstacle to getting stuff done. It's been suggested to me that Radeon
> RX 560 would be well supported by distro-provided drivers, but the
> "*2" footnote at
On Thu, Oct 26, 2017 at 9:34 AM, Henri Sivonen wrote:
> As for the computer at hand, I want to put an end to this Nvidia
> obstacle to getting stuff done. It's been suggested to me that Radeon
> RX 560 would be well supported by distro-provided drivers, but the
> "*2"
Yeah. I'd suggest anyone who's running Linux on these machines just go
out and buy a $100 AMD GPU to replace the Quadro. Even if you don't
expense the new GPU and just throw the Quadro in the trash you'll
probably be happier.
-Jeff
On Thu, Oct 26, 2017 at 9:34 AM, Henri Sivonen
On Thu, Oct 26, 2017 at 9:15 AM, Henri Sivonen wrote:
> There's a huge downside, though:
> If the screen stops consuming the DisplayPort data stream, the
> graphical session gets killed! So if you do normal things like turn
> the screen off or switch input on a multi-input
On Thu, Mar 23, 2017 at 3:43 PM, Henri Sivonen wrote:
> On Wed, Jul 6, 2016 at 2:42 AM, Gregory Szorc wrote:
>> The Lenovo ThinkStation P710 is a good starting point (
>> http://shop.lenovo.com/us/en/workstations/thinkstation/p-series/p710/).
>
> To help