WHY

  • Thread starter: Guest
Harlan

The way I would join to a table 30 times is through OLAP.

I would have a simple time dimension and drag and drop the months on the
horizontal axis, and I would have the principal (or interest rate, or
anything else you need) on the vertical axis and then have a simple formula
to calculate the value at the intersection.

(1 + percent)^(48/12)

That would be the formula-- it would be easy to do.
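(With percent = 0.07, say, that works out to 1.07^4, about 1.311, i.e.
roughly 31% compound growth over the four years.)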
 
I just meant anything .NET was a lot faster than previous C++ and VB and
JScript BS.


And I'm not short-sighted; I started using spreadsheets 12 years ago AND I
DECIDED TO EXPAND MY HORIZONS.
 
Yes, but it would be more efficient for you beancounters to grow into VBA
with Access than VBA for Excel.

You can actually produce usable reports with an awesome degree of
customization with Access and VBA.

VBA is the bomb.

I just got kind of burnt out on looping through cells and looking for values.

There is a better way; it is called Access VBA
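For instance, instead of looping cells you just ask for the rows you
want (a sketch only-- table and column names made up):

Select InvoiceID, Amount
From Invoices        -- hypothetical table
Where Amount > 1000  -- the 'looking for values' part, done in one query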
 
yes, I do expect you to believe that.

SQL Server on commodity hardware is always a better price-performance
proposition than being tied into one mainframe vendor.

I'm sorry that IBM wants you to change the OS every other year; I think that
is kind of comical.

And, no... I analyzed billions of records with subsecond response times
using this technology called OLAP. Record count becomes irrelevant once you
start storing aggregations in multi-dimensional structures.

And I do have experience with insurance and banks. I have quite a bit of
accounting and finance experience; I studied it in college; and I have
reported on financials for about a dozen clients.

I just know that most of these accounting and finance problems that you
think are _SO_COMPLEX_ are easily solvable using industry-standard
databases.

And everyone would be better off to stop using Excel and to start using
Access.

I know that the RDBMS solutions are a better way to analyze records and
build reports than using Microsoft excel.

You don't need to try to slam my experiences. You're the one that is
obsolete, kid.
 
I have automated reading system logs in VB and SQL Server without a problem.

I have analyzed logs from hundreds of machines at Microsoft, specifically
for virus-tracking.

It is scalable; it is efficient.

And it is just blatantly more powerful than Perl and all those other BS
languages.

And I could do any of those things in SQL Server/ADP


Just because you don't know jackshit about databases; that doesn't mean that
I'm oversimplifying.

You're the preschooler using Excel, anyways

VB won the war, Microsoft is trying to push this C# BS down our throats..
and I'm not going to let it happen.

Microsoft is wrong
IBM is wrong

5 million VB developers can't be wrong

ps - when Computer Associates opens up the source to their VB program; it'll
make a comeback on *nix

yes, programmers are a waste of time.

!

Your browser and newsreader (not to mention your OS) were written in
SQL by DBAs?!
Most programming languages out there are obsolete.. C++ or C# or Perl-- give
me a break.

Ever tried to automate monitoring of system logs without using Perl or
some other scripting language? Definitely not fun!

Ever tried to write a device driver without using C or Assembler?
(Disclosure: I've written only one, following a cookbook procedure in
C, that provided a power on password with certain validation rules for
those passwords, and had to be loaded from CONFIG.SYS.)

If you're talking about generating reports, then I'd only agree that
most languages are overkill and not the best tool for the task (aside
from Perl and similar scripting languages summarizing system logs).
www.sqlmag.com has a new report that says that like 80% of the top database
people in the country use VB or VBA or VB.net.

Top database people? Which country? That sqlmag url redirects to

http://www.windowsitpro.com/SQLServer/

so do you really believe it'd cover all SQL systems or just SQL Server?
All OSs on which SQL RDBMSs run or just Windows? You really have a
narrow perspective, not to mention real problems with credulousness.

In the real world most database processing (in terms of bytes of data
processed) still takes place on mainframes. Ain't no stinkin' VB[*] on
mainframes, though mainframes could be ODBC data sources if a suitable
interface were implemented.

If you work for a small firm or one with a low transaction count, then
it's likely your firm doesn't use a mainframe. If so, you're still
suffering from the limitations of your own experience.

As for the web, check out Netcraft if you haven't already. Most web
sites run under Unix-like OSs, e.g., Amazon's. Ain't no stinkin'
VB[*] on Unix-like systems, and web servers aren't ODBC data sources
(though there may be tools to treat tables on html pages as relational
tables).

So how do you explain all those banks, insurance companies, securities
traders and brokers, government agencies that use mainframes? They just
don't employ any of the 'top database people'? Guess that must also
hold true for all the companies using Solaris, Linux, [*]BSD or even
Unixware and Mac OS X as the OSs for their web servers.

Grow up!
it's like a dead issue. VB won the war.

If working in a Windows-only shop, it may seem that way.
It is 3x easier to write something in ADP/MDB than it is in VB; or any other
language.

Anything?! How would you use ADP/MDB to write a simulation application
to model life insurance costs for a set of individuals of different
ages and risk characteristics? How would you use ADP/MDB to determine
the risk-based capital of Citibank? How would you use ADP/MDB to
optimize lift vs drag in wing designs? How would you even use ADP/MDB
to model traffic patterns to optimize traffic signal coordination and
timing in a given geographical area?

You are SO naive! Go back to writing database backends for your time
sheet reports.
 
Harald Staff wrote...
I know you're a clever guy, so your postings here can't possibly be written
to enlighten the OP. So on behalf of us readers, thanks for interesting
material.
....

Even the OP can get a clue, though the odds may be cosmically against
it.
 
(e-mail address removed) wrote...
I just meant anything .NET was a lot faster than previous C++ and VB and
JScript BS.
....

.NET can't be faster than tuned, well written C. It's arguable whether
it could be faster than well written C++ or VB compiled into native
hardware machine code, at least as monolithic .EXEs rather than .EXEs
making lots of calls to DLLs. On balance it's a good idea, and
Microsoft hopes & prays it'll kill Java and JVMs.
 
(e-mail address removed) wrote...
yes, I do expect you to believe that.
??!

SQL Server on commodity hardware is always a better price-performance
proposition than being tied into one mainframe vendor.

mySQL or PostgreSQL running under Linux or [Free/Net]BSD on commodity
hardware is even better price-performance, which is why it's dominant
on the web. You're also ignoring (or ignorant of) sunk costs. Companies
that already have mainframes (owned or leased) and a stock of tested
mainframe software would find the price-performance of sticking with
existing procedures much better than change for change's sake alone.
I'm sorry that IBM wants you to change the OS every other year; I think that
is kind of comical.

Change the OS? The mainframes I have worked with and still work with
have all run MVS (I also had a brief stint using CMS under VM). Are you
confusing MVS version updates with changing OSs? And Microsoft doesn't
expect the same thing to happen each time they release a new Windows
version? Or are you referring to mainframe versions of Linux, which are
almost always run under VM (which has been around for over two decades)
along with MVS and even DOS/360 for some real legacy stuff? If so,
Linux on mainframes is a replacement, and under VM runs alongside
preexisting OSs.

So adding proof of your ignorance of mainframes to your demonstrated
ignorance of supercomputers and clusters. Well done!
And, no... I analyzed billions of records with subsecond response times
using this technology called OLAP. Record count becomes irrelevant once you
start storing aggregations in multi-dimensional structures.
....

No it doesn't. You have possibly many CPUs. Each of those CPUs has a
max data throughput no faster than its clock speed. If you have 4 CPUs
running in parallel, each with a 2GHz clock speed, the fastest you could
run through 1 billion records each of 100 bytes would be just over 3
seconds. That assumes no OS overhead, no parallel processing
synchronization overhead, and 32-bit word throughput equal to clock
speed. No way!

What would almost certainly be going on is caching of intermediate
results and indexing to skip unneeded records. Definite benefits to
that, but such benefits could be achieved by other means as well. But
the speeds you're reporting are *not* the result of querying the
underlying database.

That's the way Essbase and Applix TM/1 work, and (FWLIW) the way VP
Planner's 'Multidimensional Database' worked back in the late 1980s.
I just know that most of these accounting and finance problems that you
think are _SO_COMPLEX_ are easily solvable using industry-standard
databases.

Really? How do you use a database to hedge a commercial bond portfolio
against liquidity and call risk using interest rate derivatives? How do
you use databases to calculate risk based capital for banks or
insurance companies requiring estimates of confidence intervals of
statistical distributions? How do you use a database to decide between
alternative aircraft designs using different types of jet engines with
different thrust and fuel consumption characteristics? How do you use a
database to construct the optimal course schedule for a university for
a given term, where optimality is measured as the greatest number
of students able to take the greatest number of their first choice
courses given class size and/or classroom seating constraints?

There really are real world problems that require more than counting
and totalling. No question databases would often be the ideal storage
subsystems for such applications, but most of the real work would need
to be done by procedural or functional code. Not necessarily
spreadsheets, but neither via SQL nor OLAP.
And everyone would be better off to stop using Excel and to start using
Access.

For database-like tasks, agreed as long as they have Access. In case
you've missed this point in previous messages, not every version of
Office comes with Access, but all come with Excel. One can only use the
tools one has.

For decidedly non-database tasks, it'd be pure foolishness to use
Access. Which is why you'd use it.
I know that the RDBMS solutions are a better way to analyze records and
build reports than using Microsoft excel.

Granted. However, to repeat yet again, not everything done in Excel
involves generating reports from records. It may be theoretically
possible to bludgeon any application's data into normalized tables and
create data structures from SQL queries, but it makes no more sense to
do so for most non-report spreadsheets than it would to use an RDBMS to
implement a bitmap drawing package or a compressed file archiver.
You don't need to try to slam my experiences. You're the one that is
obsolete, kid.

So far you've demonstrated that you don't know

1. what an amortization table is,
2. what calculation-intensive software (what's run on supercomputers
and clusters) is,
3. how to structure currency conversion rate tables,
4. anything about financial services businesses,
5. what MDX really provides (as opposed to what you believe it
provides) compared to Excel,
6. any statistics beyond what an average is,
7. any DBMS other than SQL Server under Windows,
8. any real time systems,
9. details of the history of the dot.com bubble and bust,
10. what people outside the IT departments really do beyond generating
regular, repetitive reports.

Maybe you have business experience beyond writing database reporting
apps that any other moderately competent DB programmer could have
written, but you haven't demonstrated any subject knowledge, not even
any detailed SQL knowledge (your pathetic attempt at a rough
description of how you'd put together an amortization table in a DBMS
was particularly indicative of the shallowness and narrowness of your
capabilities).
 
(e-mail address removed) wrote...
but what happens if you're using a spreadsheet; and you're sitting pretty at
40k rows and finally your company comes out with a bunch of new products and
changes a bunch of management.. they say that they need this report to
include salesperson and commission.

you don't KNOW when you're going to hit the 64k limit; so it's in your best
interests to avoid _ANY_ handicapped client.

You're confusing amortization tables (something spreadsheets do easily
and well, databases awkwardly and poorly) with sales management reports
(something databases do much better than spreadsheets). For the latter,
no doubt and no disagreement that it's best suited to databases.

In the case of the former, the number of rows needed is VERY
PREDICTABLE. 64K rows is adequate to handle *DAILY* loan payments for
more than 179 years or *HOURLY* loan payments for more than 7 years.
It's a testament to your extreme naivete about this sort of financial
analysis that you can't even realize 64K rows are so far beyond what's
needed that this constraint would ever be a problem.
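(Spelled out: 65,536 rows / 365 payments a year is about 179.5 years of
daily payments; 65,536 / 8,760 hours a year is about 7.5 years of hourly
payments.)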

Hello! There are real world problems that require data that changes in
value frequently but remains constant in volume. Yet another problem
domain in which you've trotted out your endless ignorance for everyone
to see.

Currency exchange rates are another example. The number of currencies
may change, but there are nowhere close to 255 heavily traded
currencies. No one tries to make money on the Zimbabwean, Turkmenistani
or North Korean currencies aside from the poor souls who live in those
countries.
you can't forecast your needs-- which is why it is MUCH more efficient to
develop using MSDE (don't worry, it's just like SQL Server, but you can put
it on your laptop)

Maybe *you* can't forecast *your* needs (it's hard being at the bottom
of the ladder, ain't it?), but there are LOTS of business decisions
that involve relative handfuls of data but lots of intricate
calculations. In a different response I mentioned deciding which engine
to use in a new aircraft design. That's an application with certainly
no more than a dozen options (different jet engines), with each option
having several but hardly more than a few dozen key criteria (price,
weight, thrust, fuel consumption, expected serviceable 'lifetime',
maintenance costs to name a few for jet engines). This is something
that is certainly better suited to spreadsheet rather than database
implementation because the number of calculations (as I define
calculation) is more than an order of magnitude greater than the number
of data points, and the number of data points is on the order of a few
hundred.

Another shock for you: those who are really valuable to their companies
tend to do lots with little bits of data.
so that you can grow to millions of records without stressing out when your
boss asks you to change your report.

If the bulk of my job were writing reports this might concern me. And
whatcha know, I've already agreed that if that were the case I'd be
using a DBMS. But that's not what I do, and I don't use databases all
the time, but I do use spreadsheets frequently (along with a lot of
other tools). You conclude from this that I don't know what tool to use
(DBMS for EVERYTHING!), but it's really just an indication that you
have no idea what people like me do. The only way you can hide the
immensity of your own ignorance from yourself is to turn up the volume
on your one-size-fits-all DBMS diatribe. Pitiful!
I just don't like how it is impossible to get all of the logic out of a
spreadsheet-- it is impossible to find errors, it is impossible to GROW your
solution so that it can include other areas.

If you don't and seemingly can't understand how something works, it's
easy to believe it can't work.

There are ways to pull the logic out of spreadsheets. Writing formula
listings in R1C1 address format to text files. Storing VBA modules as
plain text files and using standard revision control systems to track
changes in the source code. These make it pretty simple to locate most
errors. There are nasty errors that avoid detection, but that's true of
ALL software. (And, though I doubt you're interested, there have been
studies of error rates in spreadsheets compared to more traditional
programming, and the error rates aren't significantly different. See
http://www.panko.com/SSR/HOME.HTM )

If the original design is sound (this may not be true of most
spreadsheets), it's not difficult to expand such models to accommodate
larger problem domains. At some point diminishing returns will kick in,
but not until data reaches a few thousand entries.
All you do with Excel is cut and paste; and I think that it's hilarious that
grown-ups think that this is an efficient solution.

I can only accept that this is *your* entire experience with Excel, so
*you* only know how to cut and paste in Excel. You've already amply
demonstrated you lack the experience to have seen anything else done in
Excel or the wit to dream up anything you haven't seen others do in
Excel.
 
uh, I just use the checksum function

WTF, why would I want to sort in an application? I can sort everything I
need on the database side.

it's like-- use the tools you have instead of trying to duplicate
functionality in the wrong tier.
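For instance (hypothetical table and column names, just to illustrate):

Select CustomerID, Amount
From Orders          -- hypothetical table
Order By Amount Desc -- the database does the sorting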

I can do a bubble sort, anything like that.. these sorts are made a lot
easier in VB.NET

aaron
 
AOL has lost, what, 22% of their subscribers in the past 2 years?

And just because you're a macgeek, that doesn't mean that Macs are prevalent

and web-based mail sux.. it is too slow; it isn't dependable.

there isn't a company in the world that has made it reliable enough to USE

(technically I got termed from MSN for bitching about Hotmail
crashing/erroring out EVERY FRIGGIN DAY)

I do hope that someday Google comes out with a hosted database based on
MySQL

they're the one company in the world that could actually pull this off.

but for now-- using Excel to pull data from a database; and putting all of
your business logic in each copy of the report you make-- it is just
comical.

how do you manage your spreadsheets?

how can you TELL when you have a problem where one cell is pointing to the
wrong column?

I just don't think that it is possible to test a spreadsheet and determine
if it is accurate.

I believe it is quite easy to make queries in Access or stored procs or
views in SQL Server-- you pass them a parameter if you need; the bottom line
is that you can reuse one set of numbers in multiple places, instead of
needing to copy them around all the time.
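Something like this (a sketch only-- proc and table names are made up):

Create Procedure dbo.SalesByMonth @Year int
As
Select Month(OrderDate) As Mo, Sum(Amount) As Total
From Orders                    -- hypothetical table
Where Year(OrderDate) = @Year  -- the parameter you pass in
Group By Month(OrderDate)

Then every report pulls from the same proc instead of carrying its own
pasted copy of the numbers.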

I am glad that you're TRYING to stick up for this obsolete app.

hopefully people will keep on MIS-using databases; and there will be plenty
of work for me in the future LoL
 
yeah, like it takes a doctorate to explain IBM's family of operating
systems.

the bottom line is that they don't have consistent offerings; they are
confusing.

they are the biggest waste of intellectual capital of all time.

I had read that 7 out of 10 CS graduates worldwide go to work for IBM.

I just think that it is ridiculous that they have 20 different operating
systems.

And re: Windows OS; I only know of ONE Windows-- that is Windows 2000. I
haven't used 9x since, uh, 95. NT Workstation was always preferable to me--
since it was stable, and it was fast.

Windows 2003 is a different version of the same OS-- it's not like you need
to rewrite everything in order to upgrade a webserver to 2003.

2003 has some AWESOME performance and security benefits.

and from what I've seen with databases; there is no point in keeping a dozen
mainframe administrators around in order to write RPG reports and bs like
that.

I don't argue that Microsoft's approach is FLAWLESS. But it is a much
better VALUE than anything else on the market-- even MySQL costs a crapload
of money for commercial use.

MSDE-- Microsoft Data Engine-- this is a truly free database backend; and
the next version is going to clean up market share like you've never seen
before.

 
Look buddy, C has been dead for 20 years; and if you're a C or C++
developer; you can go and spend 6 months trying to write a website LoL

It's just a different language-- nobody else in the whole wide world knows
how to make a decent IDE.. and the idea of allocating memory and wasting
time-- that is so 1993-- I mean.. VB takes care of all of that.

It is the most popular language in the world.

Do you honestly think that C or C++ is competitive from a developer's
perspective?
 
The way I would join to a table 30 times is through OLAP.

It'd be 360 or 361 times for a 30 year loan with monthly payments.
I would have a simple time dimension and drag and drop the months on the
horizontal axis, and I would have the principal (or interest rate, or
anything else you need) on the vertical axis and then have a simple formula
to calculate the value at the intersection.

And dragging and dropping the time dimension differs from how it'd be
done in spreadsheets how, exactly?

Let's say you have N+1 periods along the time dim'n (0..360, so N=360),
and you have another dim'n with principal paid, interest paid and
principal balance entries/items/categories, whatever the term. Each
'intersection' is equivalent to a spreadsheet cell. Principal paid is
always simple.
(1 + percent)^(48/12)

That would be the formula-- it would be easy to do.

Not quite.

In order to calculate the entries in the amortization table, you'd need
to start with these parameters: initial loan amount (LoanAmount),
effective periodic interest rate (InterestRate) and number of periods
(N). The periodic payment can be derived as

Payment = LoanAmount * InterestRate / (1 - (1 + InterestRate)^-N)
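(To plug in some hypothetical numbers: LoanAmount = 200,000,
InterestRate = 0.005 per month and N = 360 gives
Payment = 200,000 * 0.005 / (1 - 1.005^-360), roughly 1,199.10.)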

Presumably you'd need a 3rd dim'n containing LoanAmount, InterestRate,
N as a derived value (max value of time dim'n) and Payment as a
derived value. It'd be nice if N could be a user entry and the time
dim'n adapt to that entry. Perhaps that's possible using VBA to adjust
the time dim'n after the user enters N.

Since there are no payments at time zero, one needs to treat time zero
as a special case. Then the intersections of the time and principal
paid dimensions would be something like

IIF(Period > 0, Payment / (1 + InterestRate)^(N - Period + 1), 0)

And interest paid would be a function of principal paid in the same
period,

IIF(Period > 0, Payment - PrincipalPaid(Period), 0)

But principal balance would be a function of principal paid in the same
period *and* principal balance from the previous period,

IIF(Period > 0, PrincipalBalance(Period - 1) - PrincipalPaid(Period),
LoanAmount)

My syntax may not be correct since I don't use MDX, but you can confirm
whether OLAP allows the principal balance calculation, basing the value
in any time period in part on its value in the preceding time period.
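For comparison, a rough T-SQL sketch of the same table (not MDX, and the
Periods helper table holding the integers 0 through 360 is hypothetical).
Note SQL can sidestep the period-to-period dependence entirely by using
the closed form of the balance, the present value of the remaining
payments:

Declare @LoanAmount float, @InterestRate float, @N int, @Payment float
Select @LoanAmount = 200000, @InterestRate = 0.005, @N = 360
Set @Payment = @LoanAmount * @InterestRate
             / (1 - Power(1 + @InterestRate, -@N))

Select Period,
  Case When Period > 0   -- period 0 is the special case: no payment yet
       Then @Payment / Power(1 + @InterestRate, @N - Period + 1)
       Else 0 End As PrincipalPaid,
  Case When Period > 0
       Then @Payment - @Payment / Power(1 + @InterestRate, @N - Period + 1)
       Else 0 End As InterestPaid,
  Case When Period > 0   -- closed form: PV of the remaining payments
       Then (@Payment / @InterestRate)
            * (1 - Power(1 + @InterestRate, Period - @N))
       Else @LoanAmount End As PrincipalBalance
From Periods             -- hypothetical table of integers 0..360
Order By Period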
 
Hi aaron
you still have to consider that VB is not available on all platforms (and
probably won't be in the future). No disagreement that VB is a great tool
(especially for GUIs) but this is only one specific application type. You
won't use VB for data processing (especially on *nix and mainframe
platforms). And these will be available for a very long time

Re website development: I doubt you'll develop that in VB anyway :-)
 
(e-mail address removed) wrote...
....
RE: "No matrix arithmetic functions. "

MDX stands for multidimensional expressions.. if you're really trying to
claim that I can't build a 2-dimensional matrix and do whatever I want-- I
can build matrices with 127 different dimensions.. I can build a matrix--
that instead of just going up/down and left/right-- I can go in 127
different directions

Stick with 2D matrices for now. How would you invert one? How would you
multiply two conformant ones? How would you calculate the determinant
of one? Those Excel functions aren't supported. Neither are LINEST or
LOGEST.

As for higher dimensions, I'd rather write my tensors in Mathematica
or MATLAB (or Scilab) than deal with individually indexed entries in
some hypergrid.
Can you build a 3-dimensional matrix in Excel?? A pivot
HAHAHAHAHAHAHAHAHAHAHAHAHAHAH get real kid

I don't use 3D (or higher D) arrays much in Excel. I do use
multidimensional data in Mathematica and APL, which are the more
appropriate tools for what I do. (Heck, I never made much if any use of
3D in 123 beyond relative worksheet addressing.)
you can't do a damn thing with more than 3 or 4 dimensions... I throw around
billions of records in 20 directions before breakfast every day.
....

ed speak: s/around/up/

Who needs to? Multiple dimensions aren't often needed or even useful,
at least not in what I do, and when they are, there are functional
relationships between the dimensions that make certain
dimension-reducing transformations optimal.

If you're summing and averaging sales by product by distribution
channel by region by quarter, then multiple dimensions (and caching)
make sense for something that simple. If you're doing any sort of trend
analysis, e.g., inventory control, you'd be pulling off slices in 1D
and 2D for analysis, at which point OLAP's benefits become secondary to
MDX's lack of built-in stats functions.
You can do any of that stuff that you can in excel thru either

VBA
SQL Server
or
MDX

Anyone that tells you otherwise is trying to sell you something
....

How would you generate 1,000,000 random deviates from a lognormal
distribution with parameters mu = 8.5 and sigma = 2.0 using MDX? Yes,
this can be done in Excel, and no, it doesn't need to be stored in a
single column.
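(FWIW, one way to do it in Excel: fill a 1000x1000 range with
=EXP(NORMINV(RAND(),8.5,2)), or equivalently =LOGINV(RAND(),8.5,2).)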
And here is a listing from SQL Server Books Online:
....

OK, it does seem to include most of them, though not the ones
previously mentioned. Odd that MDX includes Median as one of its own
functions but not Mode, since Mode is pretty darn simple in SQL.

Select Top 1 y       -- the most frequent value, i.e. the mode
From Tbl
Group By y
Order By Count(y) Desc;

So, can you pass any old OLAP object to any Excel function, or do you
have to extract individual numbers from OLAP objects before you can use
Excel functions? If the latter, that interface would necessarily slow
things down considerably.
And it's funny.. I looked up LINEST.. I think that it is hilarious
that you would WANT to get an array from a function.. The bottom line is
that sub-queries are a LOT more powerful than arrays..

Getting arrays (or lists) from functions is the core of most functional
programming languages, like APL, Lisp, Scheme. Excel was the first
spreadsheet to adopt this concept. I can see why you find it funny -
you don't understand its use. And it beats 123's @REGRESSION, which
needs convoluted 3rd argument codes to return individual pieces of what
LINEST returns.

If you need a subquery, they may be better than arrays. If you need the
matrix product of two conformant 2D arrays, subqueries just don't cut
it. You don't understand the problem domain (again).
If you weren't SCARED TO BE SEEN AS A DATABASE DEVELOPER you would know
that..

If you weren't so confused into believing that databases are the only
tool you'll ever need, and so ignorant about what OLAP/MDX can't do,
you might have a chance to learn much of what you don't know.
And don't go about calling me one-dimensional.. you're the idiot that
recreates the same report by hand every week LOL

I don't generate any reports. You seem to need constant reminding about
this. Memory as weak as your intellect?

Or are you so incapable of providing serious and detailed examples of
OLAP/MDX or even SQL that all you can manage is attributing to me the
limited set of Excel skills you're barely capable of?
 
Yes, but it would be more efficient for you beancounters to grow into VBA
with Access than VBA for Excel.

I just got kind of burnt out on looping through cells and looking for values.

You said you were an Excel power user. A power user is NOT supposed to loop
through cells just to look for values. I'd fire you for doing that.

But what you really should do is do a little research on the people you pick
fights with.

Just another free & useful tip from Harald. Give some money to the charity
of your choice the day you choose to use it.
 
(e-mail address removed) wrote...
ps - linest sux anyways you can't even do a 15-digit number???

KB Q277585

Old news. Fixed in Excel 2003. So the comparison is between a
previously flawed but since fixed function in Excel vs no functionality
at all in OLAP/MDX. Do you need a clue that this comparison is
favorable to Excel? It appears you do.
WHAT ARE YOU GOING TO DO WHEN YOU HIT THE 64K LIMIT KID?

Since I don't use spreadsheets as databases, I'd go on using databases
for record-based data.
ARE YOUR NUMBERS SMALL ENOUGH TO USE WITHIN EXCEL?

More Ignorance on Parade! Do you understand what that 15 decimal digit
limit is? It's the maximum precision provided by Excel for 64-bit
floating point numbers. Other software using IEEE 64-bit reals can
provide more *binary* bits of precision, but all are limited to
strictly less than 16 decimal digits of precision. As are OLAP/MDX/SQL
Server.
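(Specifically: IEEE 754 doubles carry a 53-bit significand, and 2^53 =
9,007,199,254,740,992, so only integers up to about 9x10^15, just short
of 16 decimal digits, are represented exactly.)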

So if you knew anything about numerical programming you'd know that
this limit applies just the same to the software you've been
advocating. Kinda like believing something is superior because it comes
from the southern hemisphere where the days are longer.
WHAT IF YOUR COMPANY DOES BUSINESS IN ITALY AND YOU HAVE TO USE LIRA INSTEAD
OF DOLLARS (sorry it's euros now huh?)

More & more Ignorance on Parade! Go check the conversion rate of Euros
into Dollars.

Even if Lira were still in use, Italy's GDP might be in the
quadrillions of Lira (US style, 10^15), so within Excel's precision.
But even if it weren't, OLAP/MDX/SQL Server would suffer right along
with Excel.
I just think that it's funny that there are actually grown-up people out
there that think that Excel is ALL THAT

And it's sad that there are people who believe they're developers who
know squat all about the limits of numeric precision on digital
computers.
it was all that-- but it hasn't gone anywhere in the past 10 years; no
innovation; other than Office Web Components (an ActiveX control where you
beancounters can have a familiar pivot interface for consuming OLAP
data)
Agreed about damn little innovation in Excel over the last 10 years.
 
(e-mail address removed) wrote...
most db processing does occur on windows

If one wants to total processing hours, you may be correct. If one
totals transaction counts or bytes processed, you have no clue about
mainframes or *x web servers or database servers.
go play with your Mac, commie

The Microsoft Principle - love Windows = good person, dislike Windows =
degenerate, un-American scum. If you believe the sun shines out of
billg's backside, go burn your lips . . . again.
 
(e-mail address removed) wrote...
Look buddy, C has been dead for 20 years; and if you're a C or C++
developer; you can go and spend 6 months trying to write a website LoL

What languages do you believe Windows, SQL Server or Access are written
in? Granted C/C++ is a bad choice of programming language for junior
developers, such as you, writing cookie-cutter reporting apps. It's
only meant for real programmers.
It's just a different language-- nobody else in the whole wide world knows
how to make a decent IDE.. and the idea of allocating memory and wasting
time-- that is so 1993-- I mean.. VB takes care of all of that.

To repeat, whatcha think Microsoft uses to code Windows? Or VB itself?

Seen a Mac lately? Reasonable GUI, no VB in sight. Seen any of the
Linux desktop environments? Probably not, but Gnome, KDE, Xfce and
WindowMaker are nicer than nearly all the VB applications I've seen.

And inconsistency! Cut & paste not good in spreadsheets, but the ideal
for GUI design? That you have no experience with Tk is obvious.
It is the most popular language in the world.

Defining 'programming language' to include functional languages with
built-in grid controls, Excel is probably the most popular. Certainly
has more users than VB has developers.
Do you honestly think that C or C++ is competitive from a developer's
perspective?

For cookbook reporting and most other simple business apps, no way. For
real time monitoring & control, Ada and Forth are the only realistic
alternatives (well, maybe D and Pike too). For implementing programming
languages themselves, only C/C++ has a complete and mature tool set.
 