Dataset preview (from the dataset viewer): ~1.47M rows with three columns: index (int64), label (24 distinct classes, e.g. BENIGN), and file (2 distinct source files, e.g. finetune_train_A.csv).
Network Traffic Embeddings Dataset
Dataset Description
This dataset contains embeddings generated from the CICIDS2017 network traffic dataset using a fine-tuned Meta-Llama-3.1-70B-Instruct model. The embeddings represent network traffic flows formatted in a structured way to capture key network traffic characteristics.
Structure of Embeddings Files
combined.npy
The combined.npy file contains a NumPy array of shape (N, D), where:
- N is the total number of samples across all processed weekday files
- D is the embedding dimension (determined by the Llama model's hidden size)
This file stores the raw embedding vectors in a dense format for efficient loading and processing. Each row is the embedding of a single network traffic flow.
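Because N can be large (~1.47M rows here), memory-mapping the array may be preferable to loading it eagerly. A minimal sketch, using a small stand-in file (the name combined_demo.npy is illustrative, not part of the dataset):

```python
import numpy as np

# Toy stand-in for combined.npy; the real array has shape (N, D).
rng = np.random.default_rng(0)
demo = rng.normal(size=(100, 16)).astype(np.float32)
np.save('combined_demo.npy', demo)

# Memory-mapping keeps the data on disk and reads rows lazily,
# which helps when N is in the millions.
embeddings = np.load('combined_demo.npy', mmap_mode='r')
n_samples, dim = embeddings.shape
```

Individual rows (or slices) can then be pulled into memory only as needed.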
combined.csv
The combined.csv file contains metadata for each embedding in the corresponding combined.npy file. The CSV has the following columns:
- index: The original index of the sample in its source dataset file
- label: The classification label of the traffic flow (BENIGN or a specific attack type)
- file: The source file name (e.g., 'Monday.csv', 'Tuesday.csv', etc.)
The rows in this CSV file correspond one-to-one, in the same order, with the embeddings in the .npy file.
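Because the correspondence is positional, a basic sanity check is that the two row counts match, and .iloc (not the 'index' column, which refers back to the source CSV) selects the metadata row for a given embedding row. A sketch with small stand-in data:

```python
import numpy as np
import pandas as pd

# Toy stand-ins; the real files are combined.npy / combined.csv.
emb = np.zeros((3, 4), dtype=np.float32)
meta = pd.DataFrame({
    'index': [639667, 174317, 959922],
    'label': ['BENIGN', 'BENIGN', 'DoS Hulk'],
    'file': ['Monday.csv', 'Monday.csv', 'Wednesday.csv'],
})

# Row i of the CSV describes row i of the .npy array.
assert emb.shape[0] == len(meta)

# Positional lookup: metadata row and embedding row for sample 2.
row = meta.iloc[2]
vec = emb[2]
```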
Data Processing
The embeddings were generated by:
1. Processing network flow data from the five weekday files of the CICIDS2017 dataset (Monday through Friday)
2. Sampling up to 25,000 benign samples and 25,000 malicious samples from each day (except Monday, which contains only benign traffic)
3. Formatting each network flow as structured text with the following fields:
   - Source IP and Port
   - Destination IP and Port
   - Protocol
   - Traffic Volume (bytes in both directions)
   - Packet counts
   - TCP Flag information
   - Flow Duration
4. Extracting embeddings using a fine-tuned Meta-Llama-3.1-70B-Instruct model
5. Mean-pooling over all tokens (excluding padding) to create a fixed-size embedding for each flow
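The mask-aware mean-pooling step can be sketched in NumPy. The shapes and names below are illustrative, not the actual extraction code: hidden is assumed to be the model's last-layer states of shape (batch, seq_len, dim), and mask holds 1 for real tokens and 0 for padding.

```python
import numpy as np

def mean_pool(hidden: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Average token states per sequence, ignoring padded positions."""
    mask = mask[..., None].astype(hidden.dtype)   # (batch, seq, 1)
    summed = (hidden * mask).sum(axis=1)          # sum over real tokens
    counts = mask.sum(axis=1)                     # real tokens per sequence
    return summed / np.clip(counts, 1, None)      # (batch, dim)

hidden = np.ones((2, 3, 4), dtype=np.float32)
hidden[0, 2] = 99.0          # a padding position that must be ignored
mask = np.array([[1, 1, 0],  # last token of sequence 0 is padding
                 [1, 1, 1]])
pooled = mean_pool(hidden, mask)
```

Excluding padding matters: without the mask, the padded position's state would leak into the flow's embedding.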
Usage Information
These embeddings can be loaded and used in Python as follows:
import numpy as np
import pandas as pd
# Load the embeddings
embeddings = np.load('combined.npy')
# Load the metadata
metadata = pd.read_csv('combined.csv')
# Example: Get all embeddings for benign traffic
benign_indices = metadata[metadata['label'].str.upper() == 'BENIGN'].index
benign_embeddings = embeddings[benign_indices]
# Example: Get embeddings for a specific attack type
attack_indices = metadata[metadata['label'] == 'DoS Hulk'].index
attack_embeddings = embeddings[attack_indices]
# Example: Get embeddings from a specific day
wednesday_indices = metadata[metadata['file'] == 'Wednesday.csv'].index
wednesday_embeddings = embeddings[wednesday_indices]
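The label and file filters above also compose: selecting one attack type from one day can be done with a combined boolean mask. A sketch with toy stand-in arrays (the real files are not assumed present):

```python
import numpy as np
import pandas as pd

# Toy stand-ins for combined.npy / combined.csv.
embeddings = np.arange(12, dtype=np.float32).reshape(4, 3)
metadata = pd.DataFrame({
    'label': ['BENIGN', 'DoS Hulk', 'BENIGN', 'DoS Hulk'],
    'file':  ['Monday.csv', 'Wednesday.csv', 'Wednesday.csv', 'Wednesday.csv'],
})

# Boolean masks combine with &, so both filters apply in one pass.
mask = (metadata['label'] == 'DoS Hulk') & (metadata['file'] == 'Wednesday.csv')
subset = embeddings[mask.to_numpy()]
```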
Model Information
- Base Model: Meta-Llama-3.1-70B-Instruct
- Fine-tuned Model Path: cicids_finetuned/checkpoint-585/
- Embedding Extraction: Last layer hidden states with mean pooling
- Embedding Dimension: [Dimension size determined by model]
Limitations
These embeddings were created from a specific dataset (CICIDS2017) and may not generalize to all network environments or to newer attack types that were not present in the original dataset.