Laptop for architecture student

Pimpom

I need help in choosing a laptop for my son who will soon be off
to college to study for a degree in architecture. I'm fairly
knowledgeable about desktop hardware, especially AMD platforms.
But I don't know much about laptops.

I won't ask what would be the best laptop for an architecture
student because I can't afford a high-end model. Besides, I have
to question the wisdom of shelling out a small fortune now, only
to see it rapidly become outdated long before my son completes the
5-year course. I think it makes more sense to buy something that
would be reasonably adequate for the next few years, and then buy
another adequate-for-its-time model during the latter half of
his studies. He could then continue to use that until he starts
earning for himself.

I don't have a strict budget because I have no clear idea of what
would be "adequate" for an architecture student. I'd like to stay
within ~US$600 unless that's a ridiculously low budget. (Based on
US prices. The actual price here will be higher).

Having said all that, unless there are serious flaws in my logic
(for which I'm quite ready to be corrected), what do you advise?
An i5 system? Which level? What about AMD APUs (A8 or A10)? RAM
and other factors? Graphics? Specific laptop models? Thanks in
advance for any constructive input.
 
Paul

Pimpom said:
I need help in choosing a laptop for my son who will soon be off
to college to study for a degree in architecture. [...] What do you
advise? An i5 system? Which level? What about AMD APUs (A8 or A10)?
RAM and other factors? Graphics? Specific laptop models?

Simple. Visit the web page of the architecture department
of the university being attended. They may already have
a document defining minimum student computer requirements.

search terms : architecture department recommended laptop

You'd be surprised how much effort some universities put in, how large
an IT department they have now, and how locked-in to computers they are.

*******

http://www.pvamu.edu/Include/architecture/pdf/soa_laptop_requirements.pdf

December 13, 2006 (seven years ago)

Core 2 Duo T7400 2.00GHz
Windows XP Professional, Windows Vista Capable
17 inch Wide Screen XGA Display or better
2GB Shared Dual Channel RAM
(usual storage...)
Video card
256MB or greater
OpenGL capable <-------------
1280x1024x32-bit color

The item that defines the laptop there is a certified, CAD-capable
OpenGL graphics capability. Now, unfortunately, the graphics companies
choose to hobble regular desktop graphics. If you attempt to do complex
OpenGL with "ordinary" desktop graphics, the graphics subsystem seems to
"choke" at a relatively low item count (say, 50 items). Screen updates
slow right down, when they should not. This is done on purpose
(so you end up paying for a more expensive solution).
In many cases, there is little hardware difference between a
graphics chip sold with CAD-quality drivers and the same chip sold
as cheap desktop graphics; the real difference is in the driver.
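
If you want to sanity-check a card yourself, a rough test is to draw a
growing number of objects and watch the frame time. The sketch below is
my own illustration (assumes Python with PyOpenGL/GLUT installed; the
item counts and window size are arbitrary). On a hobbled driver, the
ms/frame figure jumps sharply past a modest item count instead of
scaling smoothly. Disable vsync first, or every reading pegs at the
refresh rate.

import sys, time
from OpenGL.GL import *
from OpenGL.GLU import gluPerspective
from OpenGL.GLUT import *

counts = [10, 50, 100, 500, 1000]    # cubes drawn per test step
state = {"i": 0, "frames": 0, "t0": time.perf_counter()}

def display():
    n = counts[state["i"]]
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT)
    glLoadIdentity()
    glTranslatef(0.0, 0.0, -50.0)
    side = int(n ** 0.5) + 1
    drawn = 0
    for x in range(side):               # lay the cubes out in a grid
        for y in range(side):
            if drawn >= n:
                break
            glPushMatrix()
            glTranslatef(x - side / 2.0, y - side / 2.0, 0.0)
            glutSolidCube(0.8)
            glPopMatrix()
            drawn += 1
    glutSwapBuffers()
    state["frames"] += 1
    if state["frames"] == 100:          # average over 100 frames per count
        dt = (time.perf_counter() - state["t0"]) / 100.0
        print("%5d items: %.2f ms/frame" % (n, dt * 1000.0))
        state["frames"] = 0
        state["t0"] = time.perf_counter()
        state["i"] += 1
        if state["i"] == len(counts):
            sys.exit(0)
    glutPostRedisplay()                 # keep rendering continuously

glutInit(sys.argv)
glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH)
glutInitWindowSize(800, 600)
glutCreateWindow(b"GL choke test")
glEnable(GL_DEPTH_TEST)
glMatrixMode(GL_PROJECTION)
gluPerspective(45.0, 800.0 / 600.0, 1.0, 200.0)
glMatrixMode(GL_MODELVIEW)
glutDisplayFunc(display)
glutMainLoop()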

By the time you identify a CAD machine, you've already seen a rapid rise
in price. And the rest of the goodies "come for free", once you've paid
for "certified driver" and "CAD capable".

At a guess, an architect drafts designs in a drawing-capture tool (3D views).

But they also do structural analysis. The CAD tools that do that,
proving a design is strong enough or earthquake-proof, are another
example of CAD-type tools. For example, if you model a skyscraper,
you apply impulse or sinusoidal excitation near the base to
see how the building wobbles (and absorbs shock without failure).
You would need to learn that for cities that have earthquake
requirements in the building code.
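
To make that concrete, the crudest version of the analysis treats the
whole building as one mass on a spring. The numbers below are invented
for illustration (this is my sketch, not a real engineering tool;
assumes Python with NumPy and SciPy):

import numpy as np
from scipy.integrate import solve_ivp

m = 1.0e6    # effective building mass, kg (assumed)
k = 4.0e7    # lateral stiffness, N/m (assumed)
c = 2.0e5    # damping, N*s/m (assumed)
w = 5.0      # ground excitation frequency, rad/s (assumed)
a0 = 0.5     # ground acceleration amplitude, m/s^2 (assumed)

def rhs(t, y):
    # x is sway relative to the ground; shaking the base enters the
    # equation of motion as an equivalent force -m * a_ground(t)
    x, v = y
    a_ground = a0 * np.sin(w * t)
    return [v, (-c * v - k * x - m * a_ground) / m]

sol = solve_ivp(rhs, (0.0, 60.0), [0.0, 0.0], max_step=0.01)
print("peak sway: %.3f m" % np.max(np.abs(sol.y[0])))

Sweeping w through the building's natural frequency (sqrt(k/m), about
6.3 rad/s here) is the sinusoidal version; a single spike in a_ground
is the impulse version.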

While it is tempting to say "buy a quad or an octa-core processor",
the thing is, software varies a lot in the extent to which it supports
threading. You would need to survey the software used, to get some idea
whether extra cores would help.
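
A quick way to see why is Amdahl's law: the serial fraction of a
program caps the speedup extra cores can give. A small sanity check
(my own illustration, with a made-up 50%-parallel workload):

def amdahl_speedup(parallel_fraction, cores):
    # ideal speedup when only parallel_fraction of the work scales
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

for cores in (1, 2, 4, 8):
    print("%d cores -> %.2fx" % (cores, amdahl_speedup(0.5, cores)))
# prints 1.00x, 1.33x, 1.60x, 1.78x - eight cores not even twice as fast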

As an example, my desktop is a dual core. Many times I've hungered for
a quad or larger, but the thing is, lots of the software I use, really
runs on one core, and the second core handles disk I/O and background
OS activity. So while I'd "feel better" if I had a quad, in a lot of
cases two of those cores would only be generating heat.

My single-core laptop was a mistake. It's just too slow for any
purpose. Background activity alone slows it down (I noticed that
right after installing a webcam and printer driver). You want
at least a dual core with a high clock, or a quad core with a moderate
clock, to help with the sins of modern OSes. OSes like Windows 8, are
always "doing maintenance for themselves", so you can count half-a-core
just to keep the OS (and AV) happy.

It's hard to find good sites which discuss such OpenGL matters. Not every
user is proficient at understanding what is holding back their
computing solution. But I've heard enough stories about "stuttering",
and seen bad benchmark results with my own hardware, to know it happens.

http://archicad-talk.graphisoft.com/viewtopic.php?p=141182

"I've tried about 18 cards in the last year... anything that has
512mb ddr2 will probably do well, GPU core considered.

If you do a lot of large models, with glass and Utilities
(sinks, toilets, rounded GSM objects) and tree models (we're talking
a lot of polygons) then the FASTER the card, the better.

I've found that the consumer cards (I.E. cheaper) work great. AC
does NOT need much to run at all... my 8400 GS in my portable laptop
(1280x800) runs fine, but stutters on big models. The 8600 GT can
handle anything so far. My 7900 GT/GS and GTX's can handle everything.
The firegl x1700 and up can do anything, as well as anything in the
professional realm (but you don't need a pro card for AC!!!!)."

The certified OpenGL solutions have brand names like Quadro and FireGL.
If you want to take a chance on ordinary desktop GPUs (non-certified
drivers), then ask a professor if that will be suitable or not.

To give you an example at the other end of the spectrum, we spent
maybe $15K to $20K to provide a mechanical engineer with an OpenGL
capable machine. One day, he gave me a demo. His session loaded 200K
parts (not polygons; a part might be a screw, for example)
from the corporate database, with a couple million polygons
(many occluded). And the poor guy could barely rotate the model
in 3D on his screen. I was shocked that all that money on hardware
was still not sufficient to help a professional do his job. I figured
he'd be able to spin that model at 10 revolutions per second on
a big hardware box. But the thing was a slug. He designed a ton of
mechanical parts for us, and seeing him have to walk away from his
desk to find something else to do is a sad commentary on computing.
You load entire models like that to check for interferences, verify
that doors open without binding, and do tolerance analysis. Sure, he
could load tiny portions of the model, and the computer would fly,
but that spoils the whole purpose of visualization in a virtual 3D
environment. Might as well go back to pencil and paper as wait
seven hours for a model to load. And if you only load a portion
of a model, you might miss some details along the way.

To summarize:

1) Get the computer requirements from the architecture department.
2) Shop for something OpenGL capable. Look for words like Quadro or FireGL.
3) The computer should have at least a dual-core CPU. And if you go
dual core, get the highest possible clock on a dual core.

It really depends on how insistent the department is that the
students be "mobile". For example, if I was buying, I might buy a
super cheap "note-taker" for a laptop, and buy a more capable
desktop (with room for second-hand Quadro or FireGL card), and
do my homework that way. If the staff insist the students drag
"workstation grade" laptops around with them, that's going to
cost a lot more.

If you're buying a workstation grade machine, get some
insurance for it :) Theft is a problem in universities,
especially near the end of the school year. Mobile devices
disappear real easy.

Paul
 
miso

I'm not buying your OpenGL story. Just exactly who is crippling OpenGL?
Just run Google Earth with DirectX and OpenGL and see if you detect a
difference. The video cards support OpenGL and DirectX as best they
can. Nobody is going to make OpenGL worse for shits and giggles.

OpenGL used to be terrible. Unfortunately for Apple, that was all they
had, so they dumped a lot of time and money into OpenGL to the benefit
of the open source community. [I rarely praised those Apple tax cheating
bastards.] You can even play games these days in OpenGL and Mesa. OpenGL
these days is OK. DirectX is better, but it only runs on Windows.

If you look at AutoCAD (the most bloated CAD there is), it looks like
they bifurcated the requirements into basic and workstation tiers.
Probably no notebook will handle the biggest jobs, but a basic
notebook will run AutoCAD.

If you are on a budget and don't mind a bit of work, you can get a Dell
with 2 GB of RAM, then stuff it to the max for very little money. I
wouldn't run a notebook these days with less than 8 GB. The reason I
suggest Dell is that they have extensive documentation, so finding out
how much RAM the mobo can use is pretty simple. The service manuals
are online.
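
If the machine is in hand and runs Linux, something like this (my own
sketch; needs root and the standard dmidecode tool) shows what's in
each slot and the maximum the board claims to take, before you order RAM:

import subprocess

out = subprocess.run(["dmidecode", "-t", "memory"],
                     capture_output=True, text=True).stdout
for line in out.splitlines():
    line = line.strip()
    # "Maximum Capacity" comes from the physical memory array entry,
    # "Size" is reported once per DIMM slot
    if line.startswith(("Maximum Capacity:", "Size:")):
        print(line)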

You will probably want to find a notebook that can do a decent job
driving an external display. That will take a bit of research. At least
HD if not 1920x1200. It doesn't pay to get a really high res display in
a notebook. You will be much happier using a large monitor.

If you really want to have fun, figure out a few suitable models, then
search the Dell Outlet. They sell returned notebooks. Since Dells are
configurable, they are often configured strangely, so the buyer just
returns them and pays a fee.

The Dell notebook I used came with 64-bit Win 7 Pro and 3 GB of RAM. Dumb.
It had a puny disk (80 GB). But the guts were mil-grade ATG, like they use
in police squad cars. I got it cheap (including the OS on DVD), put
in an SSD and stuffed it with 8 GB of RAM. The notebook was never
used by the original owner. It was brand stinkin' new.

The thing with the Dell Outlet is that the inventory is constantly
changing, so you need to check it periodically. Something like a
Latitude is a
corporate buy, and corporations often screw up their orders.
 
Paul

miso said:
I'm not buying your OpenGL story. Just exactly who is crippling OpenGL?

You don't have to.

Research it for yourself.

I checked some forums where users were comparing cards in an informal
manner. Many of the forum posts have to be discounted for sloppy test
techniques. But there's a signal there.

Even articles like this leave something to be desired. What you'd want
is software with increasing polygon counts, run on both
video cards, so you can compare how they behave.

Compare ATI X1800XT to ATI FireGL V7350 - SpecViewPerf 48 on the FireGL,
SpecViewPerf 22 on the X1800

http://www.sudhian.com/content/?p=1255

And no, the clock isn't "twice as high" on the FireGL card. There
isn't the headroom to do that.

http://www.tomshardware.com/reviews/opengl-workstation-graphics,1269.html

"In order for product positioning to have the desired effect,
they use an "independent" label (FireGL instead of Radeon) and
modify the card's BIOS and a bit of microcode in the chip. This
means that users cannot run FireGL drivers on a Radeon card, and
vice versa. The drivers themselves have built-in "artificial brakes",
meaning that a gaming card can never achieve decent values in the
OpenGL arena - that's reserved for workstation cards. As an aside,
both ATi and Nvidia use the same process to position their individual
products."

I'm surprised these techniques still work. Something like this
was done years ago, and allowed driver comparisons to be done.
(The resistor technique described here, is the same technique used
on HDaudio, to pass a four bit code on a single pin. Inside the
chip, a crude ADC converts an analog value on the pin, back into
digital form. That's why the resistor value makes a difference.)
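
As a made-up illustration of that strap scheme (all component values
assumed, not from any datasheet): a resistor to ground forms a divider
against an internal pull-up, and the crude on-chip ADC buckets the pin
voltage into one of 16 codes.

VDD = 3.3          # supply rail, volts (assumed)
R_INTERNAL = 10e3  # internal pull-up resistance, ohms (assumed)

def strap_code(r_strap):
    v_pin = VDD * r_strap / (R_INTERNAL + r_strap)  # divider output
    return min(int(v_pin / VDD * 16), 15)           # quantize to 4 bits

for r in (1e3, 3.3e3, 10e3, 33e3, 100e3):
    print("%6.0f ohm strap -> code %d" % (r, strap_code(r)))

That's why swapping one resistor value can change what the chip reports
itself to be.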

http://hackaday.com/2013/03/18/hack-removes-firmware-crippling-from-nvidia-graphics-card/

http://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/

While on some modern cards, like Fermi or Tesla, there can be
a caching feature that's enabled on the expensive cards, on the
older stuff there is much less hardware difference between the desktop
version of a chip and the certified OpenGL one. They don't run off
and do special chips for the low-volume OpenGL cards. It costs too much
to generate separate SKUs for that. Just like for some
of the Intel stuff, the Xeon (non-SMP) and desktop parts can be quite
similar, with perhaps a 2:1 difference in a TLB or something
(a larger TLB on Xeon or Opteron versus the desktop counterpart).
All set by strap options during manufacturing. Otherwise the chips,
when viewed under a microscope, would be very similar if not identical.
Many different packaged-part SKUs are made from a common silicon die design.

So the differentiation comes from the driver in this case.

And a sloppy design technique is to make the driver loading
depend only on the VEN/DEV of the card. So the GPU is only
varied enough to change the VEN/DEV it reports when probed.
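
You can see the identity a card reports for yourself. On Linux, the
vendor/device pair is exposed in sysfs (standard paths; this little
sketch of mine just filters for display controllers, class 0x03):

import glob

for dev in sorted(glob.glob("/sys/bus/pci/devices/*")):
    pci_class = open(dev + "/class").read().strip()
    if pci_class.startswith("0x03"):    # 0x03xxxx = display controller
        vendor = open(dev + "/vendor").read().strip()
        device = open(dev + "/device").read().strip()
        print("%s  vendor=%s  device=%s" % (dev.split("/")[-1], vendor, device))

A driver that keys its behavior only on that pair is exactly what the
pin-strap hacks above exploit.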

Probably the best example would be if you track down one
of the cards where the experimenters figured out what pin to
strap to make a FireGL card out of a desktop card, then
compare performance with the FireGL driver in place. That
would give the best comparison of "driver effect". I tried
looking for one of the older articles where this was done,
but can't find it now.

Paul
 
miso

Except this has nothing to do with a laptop, especially a budget laptop.
As I clearly showed, AutoCAD (oink oink) has very minimal requirements
for the lower tier. For the upper tier, a notebook won't be used.

Go to the NAHB show and look at the software demos. They use quite
ordinary hardware. Or do a survey of the software other than AutoCAD. No
balls are being busted.
 
~misfit~

miso said:
Except this has nothing to do with a laptop, especially a budget
laptop. [...]

The 'p' versions of the IBM ThinkPads had FireGL GPUs fitted whereas the
vanilla versions had Radeon GPUs.

I don't know if Lenovo ThinkPads still do a similar thing - they're a much
diluted version of the original IBM ThinkPads by all reports (and the little
hands-on experience I've had with them).
--
/Shaun.

"Humans will have advanced a long, long, way when religious belief has a
cozy little classification in the DSM."
David Melville (in r.a.s.f1)
[Sent from my OrbitalT ocular implant interface.]
 
