Delphi Datasnap – Memory Leak and its Workaround

Datasnap – Fetching data using SQL from the server…. WITHOUT crashing the server!

 

For the past few weeks, I had been noticing a snag in my application that kept coming up randomly. It had to do with calling a particular function that was responsible for taking an SQL statement as a string and returning a DatasetProvider name, so that the data could be accessed through a client dataset on the client side.

 

The function was written something like this on the Datasnap server

 

function GetDataForClient(SQL: string): string;
var
  qryNew: TADOQuery;
  dspNew: TDatasetProvider;
begin
  // Both objects are created with a nil owner and, crucially, never freed
  qryNew := TADOQuery.Create(nil);
  qryNew.Connection := SomeADOConnectionObject;
  qryNew.SQL.Text := SQL;
  qryNew.Name := 'qry' + IntToStr(UniqueID);

  dspNew := TDatasetProvider.Create(nil);
  dspNew.Name := 'dsp' + IntToStr(UniqueID);
  dspNew.Dataset := qryNew;

  qryNew.Open;

  Result := dspNew.Name;
end;

 

So, basically, as you may have already figured out, the function is meant to be called on the client side as follows (assuming all other properties were set correctly and the DS server is running):

 

procedure ClientFetchData;
begin
  ClientDataset.ProviderName := DatasnapServerMethods.GetDataForClient('select * from employees;');
  ClientDataset.Open;
end;

 

 

The above code worked quite well… until it didn't. As the application scaled and the function was called more and more often, the application would randomly throw an invalid memory exception.

 

I tried several things after that, such as releasing the objects once the data had been fetched on the client side, and running the query in a thread and then freeing the thread's objects along with the thread itself. But for some reason the error wouldn't go away, although its occurrence had dropped to once or twice a day from the earlier rate of every other hour.
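
For illustration, the "release it after the client has fetched" attempt would have looked roughly like the sketch below. ReleaseProvider and its parameters are names I am assuming for this example (they were not in the original code), and the sketch presumes the query/provider pair was created with the server module as its owner so that it can be located by name; the client would call it after its ClientDataset.Open.

// Rough sketch only – requires Classes, DB and Provider (System.Classes,
// Data.DB, Datasnap.Provider in newer Delphi versions) in the uses clause.
procedure ReleaseProvider(ServerModule: TComponent; const ProviderName: string);
var
  Comp: TComponent;
  Qry: TDataSet;
begin
  // Locate the provider that was created for this client earlier
  Comp := ServerModule.FindComponent(ProviderName);
  if Comp is TDataSetProvider then
  begin
    Qry := TDataSetProvider(Comp).DataSet;   // the TADOQuery behind the provider
    TDataSetProvider(Comp).DataSet := nil;
    Comp.Free;                               // free the provider
    if Assigned(Qry) then
      Qry.Free;                              // free the query
  end;
end;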

 

Anyhow, this wasn't something I could ignore for long, so I decided to rough it out and finally found a workaround that worked beautifully… a dataset pool!

 

What I did was create a pool of 20 query objects and their dataset providers when the DS server session was created, and destroy them when the session was destroyed. I also kept track of the last accessed query/provider pair so that the next call could be redirected to the next object in the pool. The moment I reached the last object, I reset the counter to point back to the first object of the pool.

 

The basic reason for doing this was to ensure that no objects are created dynamically inside the function call itself (as GetDataForClient did earlier).

 

Here's the implementation of the functionality.

 

TDataPair = class
public
  Query: TADOQuery;
  Provider: TDataSetProvider;
end;

 

This is the basic class that holds a query/provider pair.

 

TServerMethods2 = class(TDSServerModule)
  cnt1: TADOConnection;
  procedure DSServerModuleCreate(Sender: TObject);
  procedure DSServerModuleDestroy(Sender: TObject);
private
  { Private declarations }
  FPoolSize: Integer;       // the pool size
  FDataPool: TStringList;   // the string list object which acts as the pool
  FLastObjIndex: Integer;   // keeps track of the last accessed object in the pool
public
  { Public declarations }
  function GetData(SQL: string): Integer;
end;

 

procedure TServerMethods2.DSServerModuleCreate(Sender: TObject);
var
  I: Integer;
  DP: TDataPair;
begin
  FLastObjIndex := -1;
  FPoolSize := 20;
  FDataPool := TStringList.Create;
  for I := 0 to FPoolSize - 1 do
  begin
    DP := TDataPair.Create;

    DP.Query := TADOQuery.Create(Self);
    DP.Query.Name := 'qry' + IntToStr(I);
    DP.Query.ConnectionString := '<<some valid connection string>>';

    DP.Provider := TDataSetProvider.Create(Self);
    DP.Provider.Name := 'dsp' + IntToStr(I);
    DP.Provider.DataSet := DP.Query;

    FDataPool.AddObject(IntToStr(I), DP);
  end;
end;

 

procedure TServerMethods2.DSServerModuleDestroy(Sender: TObject);
var
  I: Integer;
  DP: TDataPair;
begin
  for I := 0 to FPoolSize - 1 do
  begin
    DP := (FDataPool.Objects[I] as TDataPair);
    DP.Query.Close;
    DP.Query.Free;
    DP.Provider.Free;
    DP.Free;
  end;
  FDataPool.Free;
end;

 

function TServerMethods2.GetData(SQL: string): Integer;
var
  DP: TDataPair;
begin
  // round-robin: move to the next pair in the pool, wrapping back to the first one
  if FLastObjIndex >= FPoolSize - 1 then
    FLastObjIndex := 0
  else
    Inc(FLastObjIndex);

  DP := (FDataPool.Objects[FLastObjIndex] as TDataPair);
  DP.Query.Close;
  DP.Query.SQL.Text := SQL;
  DP.Query.Open;

  Result := FLastObjIndex;
end;
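
As a side note, since the original GetDataForClient returned the provider name rather than an index, the pooled method could just as easily return the name and spare the client from knowing the 'dsp' + index naming convention. GetDataByName below is only a sketch with a name of my own choosing (it is not part of the original server, and it would also have to be declared in the class):

// Variant of GetData that returns the provider name instead of the pool index.
function TServerMethods2.GetDataByName(SQL: string): string;
var
  DP: TDataPair;
begin
  // same round-robin selection as GetData
  if FLastObjIndex >= FPoolSize - 1 then
    FLastObjIndex := 0
  else
    Inc(FLastObjIndex);

  DP := (FDataPool.Objects[FLastObjIndex] as TDataPair);
  DP.Query.Close;
  DP.Query.SQL.Text := SQL;
  DP.Query.Open;

  Result := DP.Provider.Name;  // e.g. 'dsp7'
end;

With that variant, the client call would simply be cds1.ProviderName := Server.GetDataByName('select * from employees;');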

 

 

On the client side, this is what the call would look like:

 

procedure TForm7.btn1Click(Sender: TObject);
begin
  cds1.Close;
  cds1.ProviderName := 'dsp' + IntToStr(Server.GetData('select * from employees;'));
  cds1.Open;
end;

 

The above implementation does exactly what the GetDataForClient function did earlier except that it doesn’t create any new objects on the server. Plus, it resolved the access violation exception.

 

HTH.

Happy coding!

5 thoughts on “Delphi Datasnap – Memory Leak and its Workaround”

  1. I'm using the same technique as yours in order to get a more dynamic query result from the server. But instead of using a pool, I search for the component pair before creating a new one.
    It works like this:
    – The client asks for a query named GetQuery_ItemList, for example
    – The server first looks for a component with the same name as the query
    – If it finds one, the server returns the dataset provider name
    – If not, the server creates a new dataset and dataset provider pair

    Of course, the scenario is that the query syntax is not sent directly by the client; it is stored in the database.

    But right now, I'm facing a memory leak problem. As users log in and out, the memory is not freed, so after a month it consumes a lot of memory. I'm still searching for a way to see which objects are left in memory.

    • I would think the problem is in the dynamic creation of the components within the method call of the server. I think the better approach is to use the pool with varying SQLs (if that’s possible) instead of dynamic creation. Hope that helps.

      • I'm using the dataset provider for transactions too; I think pooling the datasets would make handling the transactions more of a problem. I think I need to find the problem first. I've tried using a utility to read what is in the memory used by a process. The tool gives me a lot of information about what is in memory, but I still could not figure out which component did not get destroyed when a client disconnects.

      • I agree with you on finding the exact reason, but sometimes time is a luxury you don't have and customers do not want to wait for a solution. Depending on how you are placed with your problem, you can take a call. I would rather query the database from the client side, pass the SQL to the pool and get the results, if that's possible.

        Anyway, I have moved on from there so if you find the reason and the solution, please do share.

  2. Thank you very much nkaku; after a long time of trying and searching for a way out, you finally have a nice and perfect solution for my problem.
