UNIX-type shared memory under .NET

Guest

I would like to start a dialog on how to implement the equivalent
functionality of UNIX shared memory in .NET. I work with a factory
automation system. The bulk of the system is written in C/C++. It was
ported from UNIX to run under Windows using .NET. If for no other reason
than as an educational exercise, I have been wondering what it would take to
rewrite the system under C#.

The system currently has about a 100K “shared memory” segment. Under .NET
we are using memory mapped files. (I think that’s what they’re called!)
Using this, we get convenient and quick access to the data. Much of the data
is static, but a lot of it is dynamic. We have about a dozen core processes
running as services that continually access this data. Any process is
capable of both reading from and/or writing to the shared memory.
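
For concreteness, here is a minimal sketch of what the managed side of such a
mapping might look like in C#, using P/Invoke to the Win32 calls
(CreateFileMapping / MapViewOfFile) that a named shared segment is built on.
The segment name "FactoryShm" and the 100K size below are placeholders, not
the real ones:

using System;
using System.Runtime.InteropServices;

class SharedSegment
{
    const uint PAGE_READWRITE      = 0x04;
    const uint FILE_MAP_ALL_ACCESS = 0x000F001F;
    static readonly IntPtr INVALID_HANDLE_VALUE = new IntPtr(-1);

    [DllImport("kernel32.dll", SetLastError = true, CharSet = CharSet.Auto)]
    static extern IntPtr CreateFileMapping(IntPtr hFile, IntPtr lpAttributes,
        uint flProtect, uint dwMaximumSizeHigh, uint dwMaximumSizeLow, string lpName);

    [DllImport("kernel32.dll", SetLastError = true)]
    static extern IntPtr MapViewOfFile(IntPtr hFileMappingObject, uint dwDesiredAccess,
        uint dwFileOffsetHigh, uint dwFileOffsetLow, UIntPtr dwNumberOfBytesToMap);

    // Returns the base address of the shared segment in this process.
    public static IntPtr Open()
    {
        // Backed by the page file (INVALID_HANDLE_VALUE), so no disk file is needed.
        // Every process that passes the same name gets the same segment.
        IntPtr mapping = CreateFileMapping(INVALID_HANDLE_VALUE, IntPtr.Zero,
            PAGE_READWRITE, 0, 100 * 1024, "FactoryShm");
        if (mapping == IntPtr.Zero)
            throw new InvalidOperationException("CreateFileMapping failed");

        IntPtr view = MapViewOfFile(mapping, FILE_MAP_ALL_ACCESS, 0, 0, UIntPtr.Zero);
        if (view == IntPtr.Zero)
            throw new InvalidOperationException("MapViewOfFile failed");

        return view;
    }
}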

Using a very simplified example, we may have the following C structure:

typedef struct
{
    char name[9];        // station name
    bool loadPresent;    // is something currently at the station?
} STATION;

STATION stations[25];    // array of station information
STATION *gStations;      // ptr to stations array
STATION *myStation;      // ptr to individual station

A station represents a location along a conveyor where a load may be.

In this simplified example, stations[] would be placed in shared memory.
When each process starts up, the value of gStations would be populated to
point to the beginning of stations[] in shared memory. A function would
exist that, given a station name, would find the appropriate station
information and return myStation as a pointer to that station’s information.
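
As a rough sketch of how that record and the lookup-by-name function might
translate to C# (field sizes mirror the C struct above; the packing and the
copy-out semantics are assumptions on my part, and baseAddress is whatever
the mapping call returned):

using System;
using System.Runtime.InteropServices;

[StructLayout(LayoutKind.Sequential, Pack = 1, CharSet = CharSet.Ansi)]
struct Station
{
    [MarshalAs(UnmanagedType.ByValTStr, SizeConst = 9)]
    public string Name;        // station name (char name[9])
    public byte LoadPresent;   // is something currently at the station? (bool as 1 byte)
}

static class Stations
{
    const int Count = 25;

    // Equivalent of returning myStation: find the matching record in the
    // mapped segment and copy it out.
    public static Station? Find(IntPtr baseAddress, string name)
    {
        int size = Marshal.SizeOf(typeof(Station));
        for (int i = 0; i < Count; i++)
        {
            IntPtr recordAddr = new IntPtr(baseAddress.ToInt64() + i * size);
            Station s = (Station)Marshal.PtrToStructure(recordAddr, typeof(Station));
            if (s.Name == name)
                return s;
        }
        return null;   // no station with that name
    }
}

One difference from the C version: Marshal.PtrToStructure copies the record
rather than handing back a live pointer, so updates would have to be written
back with Marshal.StructureToPtr (or done through unsafe pointers).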

So, given the fact that multiple processes may need to read and update
information in the shared memory segment, what is the best way, under .NET,
to implement the same functionality? This information would be accessed on a
very frequent basis, so speed is critical.

One thought I’ve had is to have some type of server process act as the
guardian of this data. Though I don’t fully understand marshalling and
remoting under .NET, it appears that this is doable. I’m sure I could
eventually figure out the implementation details for this method, but I’m not
sure if it’s a good idea.
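
(To make the idea concrete, the shape I have in mind is roughly the following
.NET Remoting sketch: a server process owns the data and exposes it as a
well-known singleton. The class name, port, and fields below are made up for
illustration, and every access would become a cross-process call.)

using System;
using System.Runtime.Remoting;
using System.Runtime.Remoting.Channels;
using System.Runtime.Remoting.Channels.Tcp;

// Lives in the guardian server process; client processes get a proxy to it.
public class StationService : MarshalByRefObject
{
    readonly bool[] loadPresent = new bool[25];
    readonly object sync = new object();

    public bool GetLoadPresent(int station)
    {
        lock (sync) { return loadPresent[station]; }
    }

    public void SetLoadPresent(int station, bool value)
    {
        lock (sync) { loadPresent[station] = value; }
    }
}

class GuardianHost
{
    static void Main()
    {
        ChannelServices.RegisterChannel(new TcpChannel(8085));
        RemotingConfiguration.RegisterWellKnownServiceType(
            typeof(StationService), "StationService", WellKnownObjectMode.Singleton);
        Console.WriteLine("Guardian running; press Enter to exit.");
        Console.ReadLine();

        // A client would obtain the proxy with:
        // StationService svc = (StationService)Activator.GetObject(
        //     typeof(StationService), "tcp://localhost:8085/StationService");
    }
}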

Any thoughts on this would be greatly appreciated.

..ARN.
 
Dino Chiesa [Microsoft]

All the processes run on a single machine?
And they always will?
Then don't use a server process to hold the data.
Do shared memory, and a machine-wide mutex to gate access to it.

here's an implementation of the p/invoke code to do it.
http://www.winterdom.com/dev/dotnet/index.html

if the processes might be distributed, then build a server process and
expose it via webservices.
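
For the single-machine case, the gating piece is just a named (machine-wide)
Mutex that every process opens by the same name. A minimal sketch, with an
illustrative name:

using System;
using System.Threading;

class SharedMemoryGate
{
    // A named mutex is visible machine-wide: every process that opens
    // "Global\FactoryShmMutex" (the name is illustrative) gets the same kernel object.
    static readonly Mutex gate = new Mutex(false, @"Global\FactoryShmMutex");

    public static void UpdateStation(/* ... */)
    {
        gate.WaitOne();            // block until no other process holds the lock
        try
        {
            // read or write the shared segment here
        }
        finally
        {
            gate.ReleaseMutex();
        }
    }
}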


-Dino
 
Guest

On the current system, all processes run on the same machine. However, if
the system gets rewritten, it is very likely to become more distributed.

..ARN.
 
Leon Lambert

I developed something very similar. It also started on UNIX and was
eventually ported to Windows. It is a soft real-time database that
exists in shared memory, allowing very high-speed access from multiple
processes. It can also be distributed across multiple computers. I found
it fairly easy to map languages like Java and C# to it. I just created
separate classes to hold references to the shared memory handles and
used P/Invoke to call the shared memory methods. I have read a lot of
complaints about how slow P/Invoke can be, but by sticking to very
primitive types and simple marshaling schemes I didn't find this to be
true. On newer computers I still get access times in the microseconds.
Following is a tiny snippet of code

using System;
using System.Runtime.InteropServices;

internal class LocalRtdrRecord : IRtdrRecord
{
    private RecordRefnum refData = null;

    protected internal LocalRtdrRecord(RecordRefnum refData)
    {
        this.refData = refData;
    }

    /// <summary>
    /// Put a data value to the Rtdr
    /// </summary>
    /// <example>
    /// <code>
    /// IRtdr rtdr = RtdrFactory.OpenRtdrOnLocalMachine();
    /// IRtdrDatabase dr = rtdr.OpenLocalDr();
    /// IRtdrRecord slash = dr.SlashRecord;
    /// IRtdrRecord rcd = slash.SetDefaultRecord("/Scanner1/BW1/");
    /// rcd.PutValue("./LastAverage", 0.0);
    /// </code>
    /// </example>
    /// <remarks>
    /// This function is polymorphic. This means if the field is a double
    /// number type field, the data will be converted to a double then stored.
    /// </remarks>
    /// <param name="path">Is the path to the value</param>
    /// <param name="data">Is the data to store</param>
    /// <exception cref="RtdrException">Will be thrown if there is an error.</exception>
    public void PutValue(String path, double data)
    {
        RtdrInterop.PutValue(refData, path, data);
    }
}

internal class RtdrInterop
{
    internal static void PutValue(RecordRefnum rcRn, String path, double data)
    {
        int errorCode;

        if ((errorCode = PutDouble(rcRn.RefNum, path, path.Length, data)) != 0)
            RtdrException.ThrowRtdrException(errorCode);
    }

    [DllImport("rtdr.dll", CharSet = CharSet.Ansi,
        CallingConvention = CallingConvention.StdCall, ExactSpelling = true)]
    internal static extern int PutDouble(int[] refnum, String path, int sizeOfPath, double data);
}

Hope this gives you some ideas.
Leon Lambert
