Hello,
I'm trying to reproduce the results published in the paper using the EventBasedDetector with the Telegram datasets.
Because the code isn't runnable as-is, I changed the loading part and pre-parsed the .pcap and message-trace files into .csv files.
I didn't change anything in the detection logic itself.
However, when I run it I get these results:
| Observation interval | Match rate TP | Match rate FP | Total intervals |
|---:|---:|---:|---:|
| 180 s | 0.15398632970137208 | 0.0030533821874483323 | 472 |
| 300 s | 0.16517058456417724 | 0.0033411625104745827 | 418 |
| 900 s | 0.1922539784404055 | 0.004415161263713601 | 271 |
| 1800 s | 0.23280624793962684 | 0.005708325299575498 | 172 |
(The match rates are the average over all files and intervals in the given observation timespan.)
The TP rates are very low and don't come close to the ones published in the paper.
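For clarity, this is roughly how I compute the averages above. It is a minimal sketch with made-up names: `results`, the per-file tuple layout, and `average_match_rates` are my own illustration, not the notebook's actual structure.

```python
def average_match_rates(results):
    """Average per-interval TP/FP match rates over all files.

    `results` maps each trace file to a list of (tp_matches, fp_matches,
    total_candidates) tuples, one tuple per observation interval.
    Returns (mean TP rate, mean FP rate, number of intervals).
    """
    tp_rates, fp_rates, n_intervals = [], [], 0
    for intervals in results.values():
        for tp, fp, total in intervals:
            tp_rates.append(tp / total)   # per-interval TP match rate
            fp_rates.append(fp / total)   # per-interval FP match rate
            n_intervals += 1
    return (sum(tp_rates) / n_intervals,
            sum(fp_rates) / n_intervals,
            n_intervals)

# Toy example with two files and three intervals in total:
rates = {"trace_01.csv": [(3, 0, 20), (5, 1, 20)],
         "trace_02.csv": [(2, 0, 20)]}
tp, fp, n = average_match_rates(rates)
print(tp, fp, n)  # mean TP rate, mean FP rate, interval count
```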
What am I doing wrong here?
The modified Jupyter notebook (Python 3.11.3):
The converted dataset that I used (converted from the first 100 files in
\telegram\normal\adversary_message_traces
and \telegram\normal\pcaps):
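For reference, this is roughly how the .pcap pre-parsing step can be done with only the standard library. It is a sketch under my own assumptions: it handles only classic little-endian pcap files (not pcapng) and extracts just timestamp and packet length per row; the fields the detector actually needs may differ.

```python
import csv
import struct

def pcap_to_csv(pcap_path, csv_path):
    """Dump per-packet (timestamp, length) rows from a classic .pcap to CSV.

    Assumes little-endian pcap with microsecond timestamps
    (magic number 0xA1B2C3D4); pcapng files are not supported.
    """
    with open(pcap_path, "rb") as f, open(csv_path, "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["timestamp", "length"])
        global_header = f.read(24)                      # fixed 24-byte file header
        magic = struct.unpack("<I", global_header[:4])[0]
        if magic != 0xA1B2C3D4:
            raise ValueError("unsupported pcap variant")
        while True:
            rec = f.read(16)                            # per-packet record header
            if len(rec) < 16:
                break
            ts_sec, ts_usec, incl_len, orig_len = struct.unpack("<IIII", rec)
            f.seek(incl_len, 1)                         # skip payload; metadata only
            writer.writerow([ts_sec + ts_usec / 1e6, orig_len])
```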