I’m integrating a system with a bank file, and I’m having a problem with the process. I receive from the credit card operator a plain text file of approximately 1300 KB with about 5,500 lines.
I’m reading this file and storing it in ClientDataSets, in memory only; I never write to the database. However, the reading process is very slow: I’m only managing to load about 13 lines per second. I don’t have much experience with ClientDataSet and I don’t know whether this rate is acceptable.
To read the file, I load the text into a StringList and then loop over the lines of the file, sending each type of record to its respective ClientDataSet (the file can contain 9 different record types, and each one goes into a different ClientDataSet).
Loop over the file:
EnableDisableControls(False);
{ Walk through every line of the file, checking the record type and running the routine for that type. }
for I := 0 to (Extrato.Count - 1) do
begin
  CurrentLine := I;
  case StrToInt(Copy(Extrato.Strings[I], 01, 01)) of
    0: LerHeader;
    1: LerRegistroDetalheRO;     //Operation Summary (Resumo de Operação)
    2: LerRegistroDetalheCV;     //Proof of Sale (Comprovante de Venda)
    3: LerRegistroDetalheIDROSA; //RO detail report for the open balance
    4: LerRegistroDetalheIBSA;   //Open balance report by card brand
    5: LerRegistroDetalheIOAR;   //Receivables advance operation report
    6: LerRegistroDetalheIRODA;  //RO information for the advanced date
    7: LerRegistroDetalheIDRODA; //RO debit information for the advanced date
    9: LerTrailer;
  end;
end;
{ Re-enable all controls after the insertion. }
EnableDisableControls(True);
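For reference, besides DisableControls, the change log of a ClientDataSet (LogChanges) is another setting that affects insert speed when the data stays in memory only. Below is a minimal sketch of a load-mode helper along those lines; the name and structure are illustrative, not the actual EnableDisableControls routine from the question.

uses
  Datasnap.DBClient; // TClientDataSet (the unit is just DBClient in older Delphi versions)

{ Minimal sketch: switch off screen updates and the change log while loading,
  restore them afterwards. Illustrative helper, not the original routine. }
procedure PrepararClientDataSetParaCarga(CDS: TClientDataSet; Carregando: Boolean);
begin
  if Carregando then
  begin
    CDS.DisableControls;     // keep data-aware controls from reacting to every Post
    CDS.LogChanges := False; // skip the change log; this data never goes to a database
  end
  else
  begin
    CDS.LogChanges := True;  // restore the default behaviour
    CDS.EnableControls;
  end;
end;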
If the record is type 2, for example, I call the routine that writes it to the respective ClientDataSet:
procedure ThreadProcessarExtrato.LerRegistroDetalheCV;
begin
  with FrmExtratoEletronicoCielo, Extrato, DSDetalheCV.DataSet do
  begin
    Insert;
    FieldByName('TIPO_REGISTRO').AsString   := Copy(Strings[CurrentLine], 001, 1);
    FieldByName('ESTAB_SUBMISSOR').AsString := Copy(Strings[CurrentLine], 002, 10);
    FieldByName('NUMERO_RO').AsString       := Copy(Strings[CurrentLine], 012, 7);
    FieldByName('NUMERO_CARTAO').AsString   := Copy(Strings[CurrentLine], 019, 19);
    ...
    //Many other fields are assigned here; I removed them to keep this shorter...
    ...
    Post;
    Inc(TotalRegistrosCV);
  end;
end;
So, any idea how to speed up this process? Or a better way to accomplish it?
@Tiago I’m not sure I understood your approach, but what I receive from the credit card operator is plain text; the records are delimited only by character position, so I believe it isn’t possible to use the XML Mapper.
– adamasan
I thought it was XML. Maybe working with Append instead of Insert would be faster. You can do several Appends and only Post at the end. Apart from that, you can try to reduce the use of FieldByName.
– user3628
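For illustration, a sketch of these two suggestions applied to the CV routine: the TField references are resolved once for the whole file (FieldByName searches the field list by name on every call, which adds up over 5,500 lines) and Append is used instead of Insert. All names below are illustrative, not the author’s actual code.

uses
  Data.DB; // TDataSet, TField

var
  TipoRegistro, EstabSubmissor, NumeroRO, NumeroCartao: TField;

procedure PrepararCamposCV(CDS: TDataSet);
begin
  // Resolve each field once for the whole file instead of once per line.
  TipoRegistro   := CDS.FieldByName('TIPO_REGISTRO');
  EstabSubmissor := CDS.FieldByName('ESTAB_SUBMISSOR');
  NumeroRO       := CDS.FieldByName('NUMERO_RO');
  NumeroCartao   := CDS.FieldByName('NUMERO_CARTAO');
end;

procedure LerRegistroDetalheCVRapido(CDS: TDataSet; const Linha: string);
begin
  CDS.Append;                                   // Append instead of Insert
  TipoRegistro.AsString   := Copy(Linha, 001, 1);
  EstabSubmissor.AsString := Copy(Linha, 002, 10);
  NumeroRO.AsString       := Copy(Linha, 012, 7);
  NumeroCartao.AsString   := Copy(Linha, 019, 19);
  CDS.Post;
end;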
Working with TextFile, I believe, is faster.
– user3628
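A sketch of what reading with TextFile could look like, processing each line as it is read instead of loading the whole file into a StringList first; the callback parameter stands in for the case statement from the question.

uses
  System.SysUtils; // TProc<string>

{ Sketch: stream the file line by line with a classic TextFile. }
procedure LerExtratoComTextFile(const NomeArquivo: string;
  const ProcessarLinha: TProc<string>);
var
  Arquivo: TextFile;
  Linha: string;
begin
  AssignFile(Arquivo, NomeArquivo);
  Reset(Arquivo);
  try
    while not Eof(Arquivo) do
    begin
      ReadLn(Arquivo, Linha);  // one record at a time, no full copy in memory
      ProcessarLinha(Linha);   // e.g. the dispatch on the first character
    end;
  finally
    CloseFile(Arquivo);
  end;
end;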
I’ll run some tests with these alternatives, @Tiago.
– adamasan
Arthur, I’m doing the same integration and I’m having trouble reconciling the information of one record with another. For example, the file contains the records 0 - Header, 1 - RO Detail, 2 - CV Detail, 9 - Trailer. Is there some field you used to link record types 1 and 2 of the file so the information matches?
– user14223
I don’t quite understand your question, but from what I understand you want to know how I linked the fields of the Operation Summary (RO) with the Proof of Sale (CV). If so, I created a master-detail relationship between the two records, using the RO Unique Number field (in the RO) and the first 22 digits of the Transaction Unique Number field (in the CV). This is for the CV Payment file, which is one of the two file types I implemented.
– adamasan
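For illustration only, one way such a link could be expressed between two ClientDataSets: an extra key field on the CV dataset holds the first 22 digits of the transaction number, and the CV dataset is made a detail of the RO dataset on that key. The dataset, field, and key names below are assumptions, not taken from the author’s project.

uses
  Data.DB, Datasnap.DBClient; // DBClient in older Delphi versions

{ Sketch: link the CV dataset as a detail of the RO dataset.
  CDSRO, CDSCV and the field names are illustrative. }
procedure ConfigurarMestreDetalhe(CDSRO, CDSCV: TClientDataSet; DSRO: TDataSource);
begin
  DSRO.DataSet := CDSRO;

  // Detail side: index on the derived key and bind it to the master field.
  CDSCV.IndexFieldNames := 'CHAVE_RO';      // filled with Copy(transaction number, 1, 22) during the load
  CDSCV.MasterSource    := DSRO;
  CDSCV.MasterFields    := 'NUMERO_UNICO_RO';
end;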