Simon Harvey
Festive greetings, fellow programmers!
I've been programming now for about 4, maybe 5 years. 4 of those years were
at university, so I haven't had much work experience of making real-world
applications (although I'm trying to make some now). There is still a lot I
don't know when it comes to making programs. I know all the theory, but not
how (and why) certain things are done in real-world projects.
My current ponderings are about interfaces. I think I understand the
theory -
An interface is used to define a contract between two entities. One entity
implements the interface, and the other can program against that interface,
knowing that whatever object is there at runtime - as long as it implements
the interface - everything will be fine.
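Just to check I've got the idea straight, here's a minimal sketch of what I
mean (all the names here are made up purely for illustration):

// A contract: anything that can send a message.
interface MessageSender {
    void send(String recipient, String body);
}

// One possible implementation of that contract.
class SmtpSender implements MessageSender {
    public void send(String recipient, String body) {
        // (real SMTP code would go here)
        System.out.println("pretending to SMTP-send to " + recipient);
    }
}

// This code only knows about the interface, never the concrete class.
class Client {
    private final MessageSender sender;

    Client(MessageSender sender) {
        this.sender = sender;
    }

    void greet(String user) {
        sender.send(user, "Hello!");
    }
}

At runtime the Client could be handed an SmtpSender, a fake sender for
testing, whatever - it neither knows nor cares.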
Well, that's nice then. I can understand why that might be a good idea on
occasion (in theory). Being able to say - "OK, I don't want to know what the
actual object is as such; I just want to know that it will fulfill its
obligations".
The thing I'm crap at is - knowing when to create an interface.
How do you know? Which objects should have interfaces made for them? I
haven't got a clue, because I was never taught about it.
I'm currently trying to make an email application that will store received
messages on the file system. I'm trying to figure out if I should make any
interfaces - but I just don't know.
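For example, I can imagine putting the storage behind an interface, something
like this (just a sketch of one option I'm considering; MessageStore and
FileSystemMessageStore are hypothetical names I've invented):

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

// Hypothetical contract for "somewhere received messages can be kept".
interface MessageStore {
    void save(String messageId, String rawMessage) throws IOException;
    String load(String messageId) throws IOException;
}

// One implementation: each message lives in its own file on disk.
class FileSystemMessageStore implements MessageStore {
    private final Path directory;

    FileSystemMessageStore(Path directory) {
        this.directory = directory;
    }

    public void save(String messageId, String rawMessage) throws IOException {
        Files.writeString(directory.resolve(messageId + ".eml"), rawMessage);
    }

    public String load(String messageId) throws IOException {
        return Files.readString(directory.resolve(messageId + ".eml"));
    }
}

The rest of the application would then only ever see MessageStore - but I
don't know whether that's actually worth doing, or just ceremony.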
I could make an interface for loads of objects, but I'm not sure what the
point would be. I'm sure I should have at least some interfaces, but I don't
know where to put them. It's really a problem of practice rather than theory.
Could anyone offer me some general advice on how to spot potential
interfaces? I mean, there must be some approach to it; some sort of rules
that developers either consciously or subconsciously apply. If anyone could
offer any advice at all on how to spot/determine when an interface should be
employed, that would be excellent.
Many thanks to anyone who can help.
Kind Regards
Simon