What is the best way to process and record a large amount of data?
Good day, colleagues.
I am writing a service that processes incoming data.
The data looks like: {sip:"1.1.1.1:11233", title:"test", value:"log"}
sip: the source of the data.
title: the title; several items can arrive with the same title for a given sip.
value: the data itself (a string).
I will store only unique values for each sip+title pair.
let's say:
sip:1.1.1.1:11233, title:block, value:wr3333333333
sip:1.1.1.1:11233, title:block, value:wefwefwfwefew
sip:1.1.1.1:11233, title:view, value:ppppppp
$data_ara[$sip][$title][]=$value;
Then, before writing, I would simply process the array, removing duplicates for each title, build an INSERT query, and write the data to MySQL.
P.S. To clarify: I don't write the data out immediately because it can arrive at 30-40 items per second, which is too much to write indiscriminately; it is more logical to accumulate it, process it, and write it all in one action every 5 seconds (say).
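For reference, the whole accumulate-deduplicate-flush cycle described above might look roughly like the sketch below on Node.js. The `mysql` npm package, the `logs` table, and its column names are assumptions here, not anything stated in the question; values are stored as object keys so duplicates collapse on arrival instead of being filtered right before the write.

var mysql = require('mysql');

// Hypothetical connection settings; replace with your own.
var connection = mysql.createConnection({
    host: 'localhost', user: 'user', password: 'pass', database: 'db'
});

// buffer[sip][title] is an object whose keys are the values seen so far,
// so duplicate values collapse automatically on arrival.
var buffer = {};

function addData(data) {
    var titles = buffer[data.sip] || (buffer[data.sip] = {});
    var values = titles[data.title] || (titles[data.title] = {});
    values[data.value] = true;
}

// Every 5 seconds, turn the buffer into rows and write them in one bulk INSERT.
function flush() {
    var rows = [];
    Object.keys(buffer).forEach(function (sip) {
        Object.keys(buffer[sip]).forEach(function (title) {
            Object.keys(buffer[sip][title]).forEach(function (value) {
                rows.push([sip, title, value]);
            });
        });
    });
    buffer = {};
    if (rows.length) {
        // The `logs` table is hypothetical; "VALUES ?" with a nested array
        // performs a bulk insert in the mysql package.
        connection.query('INSERT INTO logs (sip, title, value) VALUES ?', [rows], function (err) {
            if (err) console.error(err);
        });
    }
}

setInterval(flush, 5000);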
For example like this:
var myDataObject = {};

var dataExample1 = { sip: "1.1.1.1:11233", title: "block", value: "val 1" };
var dataExample2 = { sip: "1.1.1.1:11233", title: "block", value: "val 2" };
var dataExample3 = { sip: "1.1.1.1:11233", title: "view",  value: "val 3" };

// Accumulate one incoming item into the nested sip -> title -> [values] structure.
var addMyData = function (data) {
    var exData = myDataObject[data.sip];
    if (!exData) {
        // First time we see this sip: create its entry.
        exData = myDataObject[data.sip] = { title: {} };
    }
    var exTitle = exData.title[data.title];
    if (!exTitle) {
        // First time we see this title for this sip: create its value list.
        exTitle = exData.title[data.title] = [];
    }
    exTitle.push(data.value);
};

addMyData(dataExample1);
addMyData(dataExample2);
addMyData(dataExample3);

console.log(myDataObject);
console.info(JSON.stringify(myDataObject, null, 4));
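Note that the arrays built above can still contain duplicate values. One possible way to deduplicate each title's list right before the periodic write, sketched on top of the structure above (not part of the original answer):

var dedupe = function (obj) {
    Object.keys(obj).forEach(function (sip) {
        var titles = obj[sip].title;
        Object.keys(titles).forEach(function (title) {
            var seen = {};
            // Keep only the first occurrence of each value.
            titles[title] = titles[title].filter(function (v) {
                if (seen[v]) { return false; }
                seen[v] = true;
                return true;
            });
        });
    });
};

dedupe(myDataObject);
console.info(JSON.stringify(myDataObject, null, 4));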