Ingested Flow Types
This section provides the consumer syntax for each of the supported parsers and connectors.
IPFIX, NetFlow, and sFlow Parsers
    consumers:
      - name: # Required. An array of properties defining the data consumers configured for FlowLink. For example: netflow
        parser:
          type: # Required. Information describing the parser associated with the data consumer. List of supported values: 'netflow', 'ipfix', 'sflow', 'aws', or 'text'
        connectors:
          - type: # Required. Information describing the data source connector associated with the data consumer. Supported values: 'udp', 'tcp', 'kafka', or 'aws'
            properties:
              ports: # Required parameter describing the tcp or udp port. For example: '2055'
              remote_addrs: # Optional parameter. String or list of IP address(es) to listen for as trusted data sources. Default is to allow all IPs. CIDRs are not supported. For example: '192.168.1.10,192.168.1.15'
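Putting those fields together, a minimal sketch of a NetFlow consumer listening on UDP port 2055 might look like the following. The consumer name and the remote_addrs values simply reuse the illustrative values from the field descriptions above; substitute your own exporters.

    consumers:
      - name: netflow
        parser:
          type: netflow
        connectors:
          - type: udp
            properties:
              ports: '2055'
              remote_addrs: '192.168.1.10,192.168.1.15'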
AWS Parser and Connector
    consumers:
      - name: # Required. An array of properties defining the data consumers configured for FlowLink. For example: aws
        parser:
          type: # Required. Information describing the parser associated with the data consumer. Supported value: aws
        connectors:
          - type: # Required. Information describing the data source connector associated with the data consumer. Supported value: aws
            properties:
              region: # Required. The AWS region where the VPC flow logs are stored. Value not wrapped in quotes. Examples: us-west-2 or us-east-1
              credentials: # Required. The AWS Access Key ID and AWS Access Key Secret created by IAM. The IAM user must have privileges to read CloudWatch logs. You can put the contents into a file and run a script to cat the file. Value not wrapped in quotes. For example: $cat /home/employee/aws_info
              log_groupname: # Required. The name of the AWS Log Group. Value not wrapped in quotes. For example: myVPCFlowLogs
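As a sketch, a complete AWS consumer assembled from the example values above might look like the following. The region, credentials file path, and log group name are placeholders to replace with values from your own AWS account.

    consumers:
      - name: aws
        parser:
          type: aws
        connectors:
          - type: aws
            properties:
              region: us-west-2
              credentials: $cat /home/employee/aws_info
              log_groupname: myVPCFlowLogs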
Note
The format of the Access Key ID and Access Key Secret should match the format defined in YAML Configuration.
Text Parser with TCP or UDP Connector
    consumers:
      - name: # Required. An array of properties defining the data consumers configured for FlowLink. For example: syslog
        parser:
          type: # Required. Information describing the parser associated with the data consumer. Supported value: 'text'
          properties:
            src_ip: # Required. Attribute tag or field number (starting at 1) used to extract the source IP. For example: sip
            dst_ip: # Required. Attribute tag or field number (starting at 1) used to extract the destination IP. For example: dip
            dst_port: # Required. Attribute tag or field number (starting at 1) used to extract the destination port. For example: dport
            protocol: # Required. Attribute tag or field number (starting at 1) used to extract the protocol. For example: prot
            icmp_type: # Optional. Attribute tag or field number (starting at 1) used to extract the ICMP type. For example: type
            icmp_code: # Optional. Attribute tag or field number (starting at 1) used to extract the ICMP code. For example: code
            timestamp: # Optional. Attribute tag or field number (starting at 1) used to extract the timestamp. Default: 1. For example: "date_time, 1"
            timestamp_format: # Optional. A string describing the timestamp format field(s) in a record. Supported values are year: yy[yy], month (Jan[uary], etc.): mmm[mmm], day of month: dd or _d, day of week (Mon[day], etc.): ddd[ddd], hour: HH, minutes: MM, seconds (with optional precision): SS[.0{1 or more}], time zone: ZZZ, -HH[:MM], -HHMM, ZHH[:MM], or ZHHMM, and Unix timestamp: unix. For example: "mmm dd yyyy HH:MM:SS"
        connectors:
          - type: # Required. Information describing the data source connector associated with the data consumer. List of supported values: 'tcp', 'udp', or 'sctp'
            properties:
              ports: # Required parameter describing the tcp or udp port. For example: '514'
              remote_addrs: # Optional. A comma-separated list of remote host addresses from which to accept flows. For example: '192.168.200.13'
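Assembled from the example values above, a sketch of a syslog text consumer over UDP port 514 might look like the following. The attribute tags (sip, dip, and so on) assume your records actually carry those tags; use field numbers instead if your records are positional.

    consumers:
      - name: syslog
        parser:
          type: text
          properties:
            src_ip: sip
            dst_ip: dip
            dst_port: dport
            protocol: prot
            timestamp: "date_time, 1"
            timestamp_format: "mmm dd yyyy HH:MM:SS"
        connectors:
          - type: udp
            properties:
              ports: '514'
              remote_addrs: '192.168.200.13'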
Text Parser with Kafka Connector
    consumers:
      - name: # Required. An array of properties defining the data consumers configured for FlowLink. For example: syslog
        parser:
          type: # Required. Information describing the parser associated with the data consumer. Supported value: 'text'
          properties:
            src_ip: # Required. Attribute tag or field number used to extract the source IP. For example: sip
            dst_ip: # Required. Attribute tag or field number used to extract the destination IP. For example: dip
            dst_port: # Required. Attribute tag or field number used to extract the destination port. For example: dport
            protocol: # Required. Attribute tag or field number used to extract the protocol. For example: prot
            icmp_type: # Optional. Attribute tag or field number used to extract the ICMP type. For example: type
            icmp_code: # Optional. Attribute tag or field number used to extract the ICMP code. For example: code
            timestamp: # Optional. Attribute tag or field number used to extract the timestamp. For example: "date_time, 1"
            timestamp_format: # Optional. A string describing the timestamp format field(s) in a record. Supported values are year: yy[yy], month (Jan[uary], etc.): mmm[mmm], day of month: dd or _d, day of week (Mon[day], etc.): ddd[ddd], hour: HH, minutes: MM, seconds (with optional precision): SS[.0{1 or more}], time zone: ZZZ, -HH[:MM], -HHMM, ZHH[:MM], or ZHHMM, and Unix timestamp: unix. For example: "mmm dd yyyy HH:MM:SS"
        connectors:
          - type: kafka
            properties:
              version: # Required. The version of the kafka broker(s). For example: 1.2.0
              brokers: # Required. A comma-separated list of kafka brokers using FQDN and port. For example: example.com:9092
              group: test # Kafka consumer group ID. Example value shown.
              topics: test # Kafka topic(s) to consume from. Example value shown.
              client_id: flowlink # Client identifier presented to the broker. Example value shown.
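A sketch of the same text parser fed from Kafka, reusing the example broker and topic values above. The broker version, FQDN and port, group, and topic are placeholders for your environment.

    consumers:
      - name: syslog
        parser:
          type: text
          properties:
            src_ip: sip
            dst_ip: dip
            dst_port: dport
            protocol: prot
        connectors:
          - type: kafka
            properties:
              version: 1.2.0
              brokers: example.com:9092
              group: test
              topics: test
              client_id: flowlink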