PC Review



Video Card for Video editing, replacement question

 
 
Benjamin Gawert
Guest
Posts: n/a
 
      26th Jul 2007
* Ken Maltby:

>>> Some of this depends on what kind of "Video Editing" you are talking
>>> about. Most editing will not need 3D accelerations. A good 400+
>>> ramdac would be the main feature I would use to pick a card for that
>>> purpose,

>> That's only relevant if you want to use a CRT (or a very low-end TFT
>> display without DVI).
>>
>>> ( I know, with digital, the "Dac" part isn't as critical,

>> The RAMDAC simply isn't used for digital connection.
>>
>>> but it can
>>> still be a real indication of how well the card was designed.)

>> Since the RAMDAC is integrated into the GPUs for almost a decade now this
>> is just nonsense.
>>
>> Benjamin

>
> Did you enter this thread just to snipe at me?


No, but you wrote nonsense and I took the liberty of correcting it. Is that a
problem for you?

> You should know
> that current cards ( and for sure older ones) support Analog VGA
> as well as Digital output modes. For instance the ATI HD 2600XT
> lists as one of its features: "Two integrated 400 Mhz 30-bit
> RAMDACs".


Exactly: "integrated" means built into the GPU.

> They also have SD and HD analog output support
> with an "Integrated AMD Xilleon HDTV encoder".


That has nothing to do with the RAMDAC.

> But my point was that a card with a 400+ RAMDAC (integrated
> within the GPU or not), would be all that you need look for, to
> select a video card for editing video.


You should sometimes read what you write. You wrote:

"I know, with digital, the "Dac" part isn't as critical, but it can
still be a real indication of how well the card was designed."

This is utterly BS because *all* modern gfx cards have the RAMDAC built
into the GPU, so the RAMDAC says *nothing* about "how well the card was
designed". Besides that, you completely ignore that ATI/AMD and Nvidia
produce reference designs which are used by the majority of board
makers.

> A cheap card with a good
> RAMDAC will do the job, all the 3D gaming accelerations are of
> no special benefit for most video editing. The use of a 400Mhz or
> above RAMDAC marks a true quality point for such cards.


That's BS, too. The 400MHz says *nothing* about the quality. It is a
measure of the signal bandwidth and tells someone who knows this stuff
what resolution/refresh rate limits the RAMDAC can output.
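To make that concrete, here is a rough sketch of the pixel clock a display mode demands from the RAMDAC, assuming the common approximation that CRT blanking adds about 40% on top of the visible pixels (real GTF/CVT timings vary per mode, so the 1.4 factor is an assumption, not a spec):

```python
# Rough estimate of the pixel clock a display mode demands from the RAMDAC.
# The RAMDAC converts one pixel per clock, and CRT timings spend extra
# clocks on horizontal/vertical blanking -- approximated here as a flat 40%.
BLANKING_OVERHEAD = 1.4  # approximation; real GTF/CVT timings vary per mode

def required_ramdac_mhz(h_res, v_res, refresh_hz, overhead=BLANKING_OVERHEAD):
    """Approximate pixel clock (MHz) a mode needs on the analogue output."""
    return h_res * v_res * refresh_hz * overhead / 1e6

for h, v, hz in [(1280, 1024, 85), (1600, 1200, 85), (2048, 1536, 85)]:
    mhz = required_ramdac_mhz(h, v, hz)
    print(f"{h}x{v}@{hz}Hz needs a pixel clock of ~{mhz:.0f} MHz")
```

By this estimate 1600x1200@85Hz needs roughly 228 MHz and 2048x1536@85Hz roughly 374 MHz: the 400MHz figure tells you which modes fit, and nothing about signal quality.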

The signal *quality*, however, depends only on the output filters, which
are required for EMI compliance; on most current cards these are cheap
types that limit the bandwidth and cause signal degradation.

> Go back to trying to get people to use Intel VM Motherboards
> with "Integrated GFX" , and stop trying to confuse things.


I'd recommend you learn the basics first instead of showing everyone that
you have no clue about this stuff.

Benjamin
 
 
 
 
 
Ken Maltby
Guest
Posts: n/a
 
      26th Jul 2007

"Benjamin Gawert" <(E-Mail Removed)> wrote in message
news:(E-Mail Removed)...

You seem incapable of reading my posts and comprehending
what is being said. It might help if you read complete sentences
and paragraphs. Placing your own interpretation on my posts may
make you feel good, but it is no help in the process of communication.

Again, a manufacturer's use of a design that includes a 400 MHz
or better RAMDAC shows that the card was built to a high quality in
the areas that relate to video editing. The RAMDAC's contribution
to that quality is more than you appear to allow, but it is only one
part of the design. My point, again, was that if a card has this area
of the design addressed, it need not include the more expensive 3D
acceleration features that are added to GPU and video card
designs (for gaming) to be an excellent card for video editing.
This makes it a good screening factor during the process of
selecting a video card for video editing, the subject of this
thread.

An inexpensive card can lack 3D gaming acceleration
but still be very good for video editing, if it is designed to the
quality that would include a 400 MHz or better RAMDAC.

You appear not to have any interest in the subject of this thread,
and I have no plans to hijack the thread to get into a technical
debate to correct your misunderstandings of the issues involved.
Now hack away at my sentences and paragraphs; I'm done with
you.

Luck;
Ken


 
 
 
 
 
Benjamin Gawert
Guest
Posts: n/a
 
      30th Jul 2007
* Ken Maltby:

> You seem incapable of reading my posts, and comprehending
> what is being said. It might help if you read complete sentences
> and paragraphs. Placing your own interpretation on my posts may
> make you feel good, but is no help in the process of communication.
>
> Again, a manufacture's use of a design that includes a RAMDAC
> 400Mhz or over shows that, the card was built to a high quality, in
> the areas that relate to video editing. The RAMDAC's contribution
> to that quality is more than you appear to allow, but it is only one
> part of the design. My point, again, was that if a card has this area
> of the design addressed, it need not include the more expensive 3D
> acceleration features that are added to GPU and Video Card
> designs (for gaming), to be an excellent card for Video Editing.
> This makes it a good screening factor during the process of
> selecting a video card for video editing, The subject of this
> thread.


And all this is still utterly BS.

First, the RAMDAC bandwidth only limits the available resolutions and
refresh rates on the VGA port and the analogue part of the DVI-I port.
It says exactly *zero* about the signal quality. The parts that mainly
influence signal quality (which affects image quality) are the output
filters, and that's it. The signal quality of a 300MHz RAMDAC is as good
as the signal quality of a 450MHz RAMDAC, period.

Second, as RAMDACs have been integrated into the GPU for almost a decade
now, the bandwidth of the RAMDAC tells you *nothing* about "how well a
card was designed [for video editing]", period. A manufacturer orders a
certain type of GPU; there is no separate choice of RAMDAC bandwidth.
Low-end GPUs often come with 300MHz RAMDACs while midrange and high-end
GPUs come with 400+MHz RAMDACs.
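As a sketch of what that 300MHz-vs-400MHz difference actually buys on the analogue output (again assuming a flat ~40% blanking overhead; exact GTF/CVT timings differ per mode), the highest refresh rate a RAMDAC can drive at a given resolution:

```python
# Invert the pixel-clock estimate: given a RAMDAC's bandwidth, what is the
# highest refresh rate it can drive at a resolution? The 1.4 blanking
# overhead is an approximation; real GTF/CVT timings differ per mode.
def max_refresh_hz(h_res, v_res, ramdac_mhz, overhead=1.4):
    """Approximate maximum refresh rate (Hz) on the analogue output."""
    return ramdac_mhz * 1e6 / (h_res * v_res * overhead)

for mhz in (300, 400):
    hz = max_refresh_hz(2048, 1536, mhz)
    print(f"{mhz}MHz RAMDAC at 2048x1536: ~{hz:.0f} Hz max")
```

By this estimate a 300MHz RAMDAC tops out around 68 Hz at 2048x1536 while a 400MHz one reaches about 90 Hz: a resolution/refresh ceiling, not a quality difference.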

Third, as the majority of card makers use reference designs from ATI/AMD
and Nvidia, the differences between gfx cards from different board makers
are usually limited to GPU/memory clock rates, cooling (different heat
sinks/fans), different analogue output filters, different memory modules,
different PCB colours, the goodies (i.e. games, adapters, software) that
come with the card, and finally the price.

> An inexpensive card can be lacking 3D gaming accelerations
> but still be very good for Video Editing, if it is designed to the
> quality that would include a 400Mhz or better RAMDAC.


And this also is just plain BS. Maybe you should first learn what a
RAMDAC does:

(old but still valid):
<http://grafi.ii.pw.edu.pl/gbm/matrox/ramdac.html>

As to the subject (video editing): the RAMDACs have absolutely *nothing*
to do with video editing. RAMDACs are the part of the GPU that drives
analogue monitors, nothing more. All other functions (like the overlay
planes that video applications also use) are part of the GPU itself and
likewise have *nothing* to do with the RAMDACs. Even worse, when video
editing is done with two digital monitors (which is usually the case
today) the RAMDACs do exactly *nothing*.

> You appear not to have any interest in the subject of this thread,
> and I have no plans to hijack the thread to get into a technical
> debate to correct your misunderstandings of the issues involved.


I'm indeed very interested in the subject. And the reason I'm answering
in this thread is not to attack you (I don't give a **** who you are)
but simply because you are trying to sell your bullshit here. This group
is here for helping each other, and spreading your nonsense doesn't help
the original poster one bit. The BS is of the same quality as what DaveW
usually spreads around; the only difference is that your posts are
longer.

> Now hack away at my sentences and paragraphs, I'm done with
> you.


Yeah, whatever. Please get at least some basic knowledge before
answering any questions. Telling stories when you don't know ****
doesn't help anyone.

Benjamin
 
 
J. Clarke
Guest
Posts: n/a
 
      31st Jul 2007
> I guess it could be overheating, but how would I resolve that? The
> fan is spinning and the heat within the computer is within an
> acceptable range.
>
> I'm sure it's the video card since I placed it into my wife's computer
> as a test. After about an hour,


If it's after an hour then it's likely a problem with the heat sink.
Pull it off, clean it, blow all the dust off it, add some fresh heat
sink compound, and put it back, then see if the problem goes away. If it
doesn't, you've lost the two and a half bucks or so that a tube of heat
sink compound costs; if it does, you've saved the price of a new
board.

> it developed the same fuzzy lines for
> the other computer. Also, I'm using her video card in my computer and
> I don't have any problems with it.
>
>
>
> On Mon, 23 Jul 2007 20:30:08 GMT, "Michael W. Ryder"
> <(E-Mail Removed)> wrote:
>
>> Ziggs wrote:
>>> Currently, I have a 3GIG system that I put together 1.5 years ago.
>>> I have a ATI Radeon 9800 pro 128 AGP card, but it's failing (tried
>>> it in two computers and fuzzy lines start to appear on the model
>>> from time to time).
>>>
>>> Anyway, I want to buy the cheapest replacement video card for now
>>> because I'll get a new computer within 6 months or so.
>>>
>>> So, my main processing needs is when I use Pinnacle to edit clips
>>> for DVD's. So, can I get away with a cheaper card for this need?
>>> If so, what card would you recommend? I really don't want to waste
>>> money on a video card that I'll only need for 6 months, but I don't
>>> want to add an additional wait time for editing videos.
>>>
>>> TIA

>>
>> Are you sure that the video card is failing and not overheating?
>> Also have you checked to be sure that the power supply is not
>> failing?


--
--John
to email, dial "usenet" and validate
(was jclarke at eye bee em dot net)


 
 
 
 



