Compare commits

...

44 Commits

Author SHA1 Message Date
David Levanon
32d473ea26 Add service mesh badge (#989)
Co-authored-by: M. Mert Yıldıran <mehmet@up9.com>
Co-authored-by: Nimrod Gilboa Markevich <59927337+nimrod-up9@users.noreply.github.com>
2022-04-13 13:30:57 +03:00
M. Mert Yıldıran
97000293fd Fix the unit tests of protocol extensions (#998) 2022-04-13 12:23:08 +03:00
gadotroee
3ed9bc1e0d Upgrade Basenine version to v0.7.1 (#984)
* Basenine MongoDB mess

* Fix more

* Fix the `mongodb` container arguments

* Add Basenine ARM64 binary

* Make the changes related to `leftOff` becoming a string value

* Make `leftOffTop` state string

* Handle `CloseConnection` in `Fetch`

* Upgrade Basenine to `0.7.0`

* Revert the changes in `package.json` and `package-lock.json`

* Fix the `Dockerfile`

* Remove the binaries

* Increase the Basenine up deadline to 20 seconds

* Revert the changes in `shared/kubernetes/provider.go`

* Fix the OAS generator tests

* Protect from race condition

* Fix mutexes

* Fix unlock

* Fix logging data types

* Try to stabilize the tests

* Remove the `replace` statement

* Revert the change done in 2899414f2b so that `leftOff` is not changed

* Change `leftOffBottom` empty string default value to `latest`

* Upgrade Basenine to `0.7.1`

* Handle the Basenine client library errors better

* Use `DEFAULT_QUERY` constant

* Remove `min=-1`

* Replace some `Errorf`s with `Panicf`s

* Remove the closure in `runGenerator` method

* Remove an unnecessary check

Co-authored-by: M. Mert Yildiran <mehmet@up9.com>
Co-authored-by: Andrey Pokhilko <apc4@ya.ru>
Co-authored-by: undera <undera@undera-old-desktop.home>
Co-authored-by: AmitUp9 <96980485+AmitUp9@users.noreply.github.com>
2022-04-13 11:28:48 +03:00
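
For context on the `leftOff` items in the commit above, a minimal, hedged sketch of a `basenine.Fetch` call after `leftOff` became a string. The call shape follows the `BasenineEntriesProvider.GetEntries` diff further down in this compare; the host and port literals are assumptions, and `"latest"` mirrors the new `leftOffBottom` default noted above.

```go
package main

import (
	"encoding/json"
	"fmt"
	"time"

	basenine "github.com/up9inc/basenine/client/go"
)

func main() {
	// After this upgrade `leftOff` is a string record index rather than an integer.
	data, meta, err := basenine.Fetch("localhost", "9099",
		"latest", // leftOff (previously an integer)
		-1,       // direction: walk backwards from leftOff
		"",       // query: empty matches every entry
		200,      // limit
		3*time.Second)
	if err != nil {
		panic(err)
	}

	var metadata *basenine.Metadata
	if err := json.Unmarshal(meta, &metadata); err != nil {
		fmt.Printf("could not decode metadata: %v\n", err)
	}
	fmt.Printf("fetched %d records, metadata: %+v\n", len(data), metadata)
}
```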
David Levanon
86e5dcea19 Tap TCP connections even if the handshake was missed (#988)
Support long-lived connections. This commit improves support for Linkerd, which uses long-lived connections.
2022-04-13 11:17:37 +03:00
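
A purely illustrative sketch (not the commit's actual implementation) of the decision the change above implies: streams whose handshake was never captured are no longer skipped. The type and field names here are hypothetical.

```go
package tap

// streamState is a hypothetical per-connection record kept by the tapper;
// the real implementation lives in the tap package and is not shown here.
type streamState struct {
	sawSYN    bool // true if the TCP handshake (SYN) was captured
	bytesSeen int  // payload bytes observed so far
}

// shouldTap sketches the decision the commit above changes: flows whose
// handshake was missed (typical for long-lived service-mesh connections
// such as Linkerd's) are tapped anyway, and the dissectors are left to
// resynchronize on the application-layer protocol.
func shouldTap(s *streamState) bool {
	if s.sawSYN {
		return true
	}
	return s.bytesSeen > 0 // mid-stream traffic is tapped as well
}
```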
lirazyehezkel
81fe4af30d close ws on modals open (#994) 2022-04-12 17:56:03 +03:00
leon-up9
df1fd2c3a7 Ui/Resiszable (#990)
* fixed toast
fixed filter refresh on reload

* reverted

* sticky selectlist header

* apply check to filtered items

* grpc filter Bug

* should almost fix filtering

* working without disabled

* handle disabled items

* small refactor

* almost working with a weird gesture

* test

* servicesFilterList height

* fixed to work

* refresh margin

* after PR notes

* remove redundant var

* pr review

* Pr comments

* removed line

* removed redundant

* nullable check

Co-authored-by: Leon <>
2022-04-12 16:27:20 +03:00
lirazyehezkel
f8496c0235 Fix screen layout (#993)
Co-authored-by: gadotroee <55343099+gadotroee@users.noreply.github.com>
2022-04-12 11:20:02 +03:00
Nimrod Gilboa Markevich
2de7107c0a Use author instead of committer in slack alerts (#992) 2022-04-12 10:48:50 +03:00
leon-up9
22e3b3d8b2 ServiceMapModal filters (#981)
* fixed toast
fixed filter refresh on reload

* reverted

* sticky selectlist header

* apply check to filtered items

* grpc filter Bug

* should almost fix filtering

* working without disabled

* handle disabled items

* small refactor

* servicesFilterList height

* after PR notes

* remove redundant var

* pr review

Co-authored-by: Leon <>
2022-04-11 17:26:28 +03:00
lirazyehezkel
45611c4c13 TRA-4477 FE holds limit of 10000 entries (#987)
* FE holds limit of 10000 entries

* let to const
2022-04-11 15:04:42 +03:00
RoyUP9
bb425fa6e2 Fixed service map unresolved bug (#986) 2022-04-11 14:37:53 +03:00
lirazyehezkel
4bc83ebcb5 Fix WS error when switching from settings to traffic viewer (#985) 2022-04-11 14:21:31 +03:00
M. Mert Yıldıran
bbb44dae79 Fix the unit tests of protocol extensions (#982) 2022-04-09 06:56:09 -07:00
M. Mert Yıldıran
72a1aba3e5 TRA-4410 Display namespace field in the UI (#974) 2022-04-08 21:16:25 +03:00
RoyUP9
d8fb8ff710 Fix for OAS reset not working (#978) 2022-04-07 18:14:03 +03:00
Nimrod Gilboa Markevich
f344bd2633 Make minor changes to OasGenerator (#977)
* Added log message
* Remove Reset function from OasGenerator interface, use Stop+Start instead
* SetEntriesQuery returns a bool stating whether the query changed
2022-04-07 10:46:30 +03:00
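
A rough sketch of the interface shape the bullet points above describe; the exact method set and signatures are assumptions, not the repository's definitions.

```go
package oas

import basenine "github.com/up9inc/basenine/client/go"

// OasGenerator as implied by the bullet points above; the exact method set
// and signatures are assumptions. The connection parameter on Start reflects
// the Start(nil) / Start(dummyConn) calls visible in the diffs further down.
type OasGenerator interface {
	Start(conn *basenine.Connection)   // begin consuming entries
	Stop()                             // Stop+Start replaces the removed Reset
	SetEntriesQuery(query string) bool // reports whether the query actually changed
}
```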
M. Mert Yıldıran
6575495fa5 Remove gRPC related modifications (#958)
* Remove gRPC related modifications

* Remove gRPC status text related modifications as well

* Fixing gRPC vertical image

Detect gRPC when the content type is 'application/grpc' as well (and not only from the grpc-status)

Co-authored-by: gadotroee <55343099+gadotroee@users.noreply.github.com>
2022-04-06 18:50:36 +03:00
RoyUP9
cf5c03d45c Fixed service map returning nil values (#975) 2022-04-06 13:12:38 +03:00
Nimrod Gilboa Markevich
491da24c63 Add ability to set query in OAS Generator (#964) 2022-04-06 11:54:55 +03:00
leon-up9
832162ae0f Ui/Service-map-split-to-ui-common (#966)
* added serviceModal & selectList & noDataMessage
removed leftovers from split

* scroll fix

* sort by name

* search alignment

* space removed

* margin-bottom

* utils class

Co-authored-by: Leon <>
Co-authored-by: Igor Gov <iggvrv@gmail.com>
2022-04-05 13:23:46 +03:00
lirazyehezkel
866378b451 Mizu can't show more than 10000 entries (#973) 2022-04-05 12:25:01 +03:00
lirazyehezkel
0b0b9ce6d1 Performance fixes (#972) 2022-04-04 20:03:57 +03:00
RoyUP9
d99c632102 Fixed golint strings.Title is deprecated error (#971) 2022-04-04 18:06:22 +03:00
RamiBerm
76a6a77a14 Refactor ws (#961)
* Separate socket and basenine logic

* WIP

* Update socket_server_handlers.go

* Update socket_data_streamer.go and socket_server_handlers.go

* Update socket_server_handlers.go

* Merge branch 'develop' into refactor_ws

* empty commit for actions

* empty commit for actions

* commit for actions

* Revert "commit for actions"

This reverts commit 8ba2ecf7d3.

Co-authored-by: RoyUP9 <87927115+RoyUP9@users.noreply.github.com>
2022-04-04 17:33:53 +03:00
M. Mert Yıldıran
2bfc523bbc Handle reflect.TypeOf returning nil case (#970) 2022-04-04 16:25:18 +03:00
RoyUP9
66ba778384 Fixed golint modified files (#969) 2022-04-04 15:32:22 +03:00
leon-up9
7adbf7bf1b Ui/TRA-4461_service-map-&-OAS---GUI-changes (#962)
* OpenAPI renamed to Service Catalog

* Docs icon change
Hide Navbar on serviceMap modal open

* PR comments

Co-authored-by: Leon <>
Co-authored-by: RoyUP9 <87927115+RoyUP9@users.noreply.github.com>
2022-04-04 14:49:41 +03:00
Igor Gov
a97b5b3b38 Add conditional Go lint validation to CI (#967) 2022-04-04 14:35:47 +03:00
Nimrod Gilboa Markevich
aa8dcc5f5c Format commit message as code to handle multi-line messages (#963)
Co-authored-by: gadotroee <55343099+gadotroee@users.noreply.github.com>
2022-04-03 22:10:43 +03:00
lirazyehezkel
9d08dbdd5d UI performance fix 2022-04-03 21:35:09 +03:00
lirazyehezkel
b47718e094 TRA-4442 Improve UI performance (#960)
* Move ws entry listener to entriesList component

* unused code
2022-04-03 15:51:20 +03:00
Igor Gov
6a7fad430c Adding resolved prop to service map node (#959)
* Adding resolved prop to service map node

* fixing tests
2022-04-03 15:32:21 +03:00
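
A hedged sketch of the node shape this commit extends with a `Resolved` flag, inferred from the updated controller test further down in this compare; the concrete type of `Entry` is an assumption.

```go
package servicemap

// ServiceMapNode as suggested by the updated controller test below.
type ServiceMapNode struct {
	Id       int
	Name     string
	Entry    interface{} // the TCP peer entry (e.g. TCPEntryA in the tests); concrete type assumed
	Resolved bool        // new: whether the service name was resolved
	Count    int
}
```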
lirazyehezkel
59ad8d8fad TLS icon position (#956)
* TLS icon position

* cr fix
2022-03-31 11:26:58 +03:00
Nimrod Gilboa Markevich
a49443f101 Set the entry namespace to the source namespace if the destination is not resolved (#950)
* Set the entry namespace to the source namespace if the destination is not resolved
* Overwrite src namespace with dst namespace only if dst non-empty
2022-03-30 15:40:21 +03:00
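
A compact restatement of the rule described above; the same logic appears in the resolveIP diff later in this compare, but this helper is illustrative rather than the exact source.

```go
package api

// pickNamespace restates the rule from commit a49443f101 above: start from
// the source's namespace and let the destination's namespace override it
// only when the destination is actually resolved (non-empty).
func pickNamespace(srcNamespace, dstNamespace string) string {
	namespace := srcNamespace
	if dstNamespace != "" {
		namespace = dstNamespace
	}
	return namespace
}
```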
lirazyehezkel
2427955aa4 Avoid overlap only for service maps containing under 10 services 2022-03-30 15:30:09 +03:00
David Levanon
27a73e21fb Read from service mesh network namespaces upon update (#944) 2022-03-30 13:56:37 +03:00
Igor Gov
8eeb0e54c9 Changing unit tests workflow timeout to 30 minutes 2022-03-30 11:52:47 +03:00
Andrey Pokhilko
97db24aeba OAS: rework data feeding + sampleIDs (#917)
* Call OAS feeder

* Don't call old OAS code

* Rework calls

* Work on it

* Put back rules

* Make it compile

* start thinking of test

* Compiles

* Save

* Fixes

* Save

* Fixing

* Trying to fake conn

* add timeout

* Test timeout

* Fix tests

* Only build OAS for HTTP entries

* Remove some dead code

* Adding SampleIDs

* Cosmetics

* lint

* Revert rename

* Sample ID for content

* Cleanup

* Add more sample IDs

* Checking hypothesis

* Move assignment place a bit

* Cosmetics

* Update test.yml

Co-authored-by: undera <undera@undera-old-desktop.home>
Co-authored-by: Igor Gov <iggvrv@gmail.com>
2022-03-30 11:14:25 +03:00
RamiBerm
63cf7ac34e Refactor entries controller logic (#949)
* wip

* Update entries_controller.go and entries_provider.go

* Update entries_controller.go

* change entries provider into a struct + interface

* Update entries_provider.go

* Update entries_provider.go
2022-03-29 18:30:19 +03:00
Igor Gov
e867b7d0f1 Build ui-common part of CI (#914)
* Build ui-common always locally
2022-03-29 14:14:52 +03:00
lirazyehezkel
dcd8a64f43 Hotfix/remove token from community (#948) 2022-03-29 13:16:50 +03:00
lirazyehezkel
bf8d5ed069 Support multiple workspaces (TRA-4365) (#945)
* support multiple workspaces

* reopen by websocket url dep

* open websocket only when websocketURL is changed

* upgrade common version

Co-authored-by: gadotroee <55343099+gadotroee@users.noreply.github.com>
2022-03-29 09:56:59 +03:00
Nimrod Gilboa Markevich
1f6e539590 Add commit message and committer to acceptance tests slack alert (#946)
* Add commit message and committer username to slack alerts
* Use name instead of username
* Use name and email
2022-03-29 09:15:42 +03:00
David Levanon
590fa08c81 EBPF error handling 2022-03-28 14:19:06 +03:00
133 changed files with 5867 additions and 1817 deletions

View File

@@ -43,7 +43,7 @@ jobs:
with:
status: ${{ job.status }}
notification_title: 'Mizu {workflow} has {status_message}'
message_format: '{emoji} *{workflow}* {status_message} during <{run_url}|run>, after commit: <{commit_url}|{commit_sha}>'
message_format: '{emoji} *{workflow}* {status_message} during <{run_url}|run>, after commit <{commit_url}|{commit_sha}> by ${{ github.event.head_commit.author.name }} <${{ github.event.head_commit.author.email }}> ```${{ github.event.head_commit.message }}```'
footer: 'Linked Repo <{repo_url}|{repo}>'
notify_when: 'failure'
env:

View File

@@ -45,7 +45,7 @@ jobs:
- name: Check modified files
id: modified_files
run: devops/check_modified_files.sh agent/ shared/ tap/ ui/ Dockerfile
run: devops/check_modified_files.sh agent/ shared/ tap/ ui/ ui-common/ Dockerfile
- name: Set up Docker Buildx
if: steps.modified_files.outputs.matched == 'true'

View File

@@ -15,6 +15,9 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
with:
fetch-depth: 2
- uses: actions/setup-go@v2
with:
go-version: '^1.17'
@@ -24,67 +27,117 @@ jobs:
sudo apt update
sudo apt install -y libpcap-dev
- name: Check Agent modified files
id: agent_modified_files
run: devops/check_modified_files.sh agent/
- name: Go lint - agent
uses: golangci/golangci-lint-action@v2
if: steps.agent_modified_files.outputs.matched == 'true'
with:
version: latest
working-directory: agent
args: --timeout=3m
- name: Check shared modified files
id: shared_modified_files
run: devops/check_modified_files.sh shared/
- name: Go lint - shared
uses: golangci/golangci-lint-action@v2
if: steps.shared_modified_files.outputs.matched == 'true'
with:
version: latest
working-directory: shared
args: --timeout=3m
- name: Check tap modified files
id: tap_modified_files
run: devops/check_modified_files.sh tap/
- name: Go lint - tap
uses: golangci/golangci-lint-action@v2
if: steps.tap_modified_files.outputs.matched == 'true'
with:
version: latest
working-directory: tap
args: --timeout=3m
- name: Check cli modified files
id: cli_modified_files
run: devops/check_modified_files.sh cli/
- name: Go lint - CLI
uses: golangci/golangci-lint-action@v2
if: steps.cli_modified_files.outputs.matched == 'true'
with:
version: latest
working-directory: cli
args: --timeout=3m
- name: Check acceptanceTests modified files
id: acceptanceTests_modified_files
run: devops/check_modified_files.sh acceptanceTests/
- name: Go lint - acceptanceTests
uses: golangci/golangci-lint-action@v2
if: steps.acceptanceTests_modified_files.outputs.matched == 'true'
with:
version: latest
working-directory: acceptanceTests
args: --timeout=3m
- name: Check tap/api modified files
id: tap_api_modified_files
run: devops/check_modified_files.sh tap/api/
- name: Go lint - tap/api
uses: golangci/golangci-lint-action@v2
if: steps.tap_api_modified_files.outputs.matched == 'true'
with:
version: latest
working-directory: tap/api
- name: Check tap/extensions/amqp modified files
id: tap_amqp_modified_files
run: devops/check_modified_files.sh tap/extensions/amqp/
- name: Go lint - tap/extensions/amqp
uses: golangci/golangci-lint-action@v2
if: steps.tap_amqp_modified_files.outputs.matched == 'true'
with:
version: latest
working-directory: tap/extensions/amqp
- name: Check tap/extensions/http modified files
id: tap_http_modified_files
run: devops/check_modified_files.sh tap/extensions/http/
- name: Go lint - tap/extensions/http
uses: golangci/golangci-lint-action@v2
if: steps.tap_http_modified_files.outputs.matched == 'true'
with:
version: latest
working-directory: tap/extensions/http
- name: Check tap/extensions/kafka modified files
id: tap_kafka_modified_files
run: devops/check_modified_files.sh tap/extensions/kafka/
- name: Go lint - tap/extensions/kafka
uses: golangci/golangci-lint-action@v2
if: steps.tap_kafka_modified_files.outputs.matched == 'true'
with:
version: latest
working-directory: tap/extensions/kafka
- name: Check tap/extensions/redis modified files
id: tap_redis_modified_files
run: devops/check_modified_files.sh tap/extensions/redis/
- name: Go lint - tap/extensions/redis
uses: golangci/golangci-lint-action@v2
if: steps.tap_redis_modified_files.outputs.matched == 'true'
with:
version: latest
working-directory: tap/extensions/redis

View File

@@ -18,6 +18,7 @@ jobs:
run-unit-tests:
name: Unit Tests
runs-on: ubuntu-latest
timeout-minutes: 20
steps:
- name: Check out code into the Go module directory
uses: actions/checkout@v2

View File

@@ -1,12 +1,23 @@
ARG BUILDARCH=amd64
ARG TARGETARCH=amd64
### Front-end common
FROM node:16 AS front-end-common
WORKDIR /app/ui-build
COPY ui-common/package.json .
COPY ui-common/package-lock.json .
RUN npm i
COPY ui-common .
RUN npm pack
### Front-end
FROM node:16 AS front-end
WORKDIR /app/ui-build
COPY ui/package.json ui/package-lock.json ./
COPY --from=front-end-common ["/app/ui-build/up9-mizu-common-0.0.0.tgz", "."]
RUN npm i
COPY ui .
RUN npm run build
@@ -76,8 +87,8 @@ RUN go build -ldflags="-extldflags=-static -s -w \
-X 'github.com/up9inc/mizu/agent/pkg/version.Ver=${VER}'" -o mizuagent .
# Download Basenine executable, verify the sha1sum
ADD https://github.com/up9inc/basenine/releases/download/v0.6.6/basenine_linux_${GOARCH} ./basenine_linux_${GOARCH}
ADD https://github.com/up9inc/basenine/releases/download/v0.6.6/basenine_linux_${GOARCH}.sha256 ./basenine_linux_${GOARCH}.sha256
ADD https://github.com/up9inc/basenine/releases/download/v0.7.1/basenine_linux_${GOARCH} ./basenine_linux_${GOARCH}
ADD https://github.com/up9inc/basenine/releases/download/v0.7.1/basenine_linux_${GOARCH}.sha256 ./basenine_linux_${GOARCH}.sha256
RUN shasum -a 256 -c basenine_linux_"${GOARCH}".sha256 && \
chmod +x ./basenine_linux_"${GOARCH}" && \

View File

@@ -15,12 +15,13 @@ require (
github.com/go-playground/validator/v10 v10.10.0
github.com/google/uuid v1.3.0
github.com/gorilla/websocket v1.4.2
github.com/jinzhu/copier v0.3.5
github.com/nav-inc/datetime v0.1.3
github.com/op/go-logging v0.0.0-20160315200505-970db520ece7
github.com/orcaman/concurrent-map v1.0.0
github.com/patrickmn/go-cache v2.1.0+incompatible
github.com/stretchr/testify v1.7.0
github.com/up9inc/basenine/client/go v0.0.0-20220326121918-785f3061c8ce
github.com/up9inc/basenine/client/go v0.0.0-20220413023528-c741e4aa1cf2
github.com/up9inc/mizu/shared v0.0.0
github.com/up9inc/mizu/tap v0.0.0
github.com/up9inc/mizu/tap/api v0.0.0

View File

@@ -428,6 +428,8 @@ github.com/imdario/mergo v0.3.12/go.mod h1:jmQim1M+e3UYxmgPu/WyfjB3N3VflVyUjjjwH
github.com/inconshreveable/mousetrap v1.0.0 h1:Z8tu5sraLXCXIcARxBp/8cbvlwVa7Z1NHg9XEKhtSvM=
github.com/inconshreveable/mousetrap v1.0.0/go.mod h1:PxqpIevigyE2G7u3NXJIT2ANytuPF1OarO4DADm73n8=
github.com/jessevdk/go-flags v1.4.0/go.mod h1:4FA24M0QyGHXBuZZK/XkWh8h0e1EYbRYJSGM75WSRxI=
github.com/jinzhu/copier v0.3.5 h1:GlvfUwHk62RokgqVNvYsku0TATCF7bAHVwEXoBh3iJg=
github.com/jinzhu/copier v0.3.5/go.mod h1:DfbEm0FYsaqBcKcFuvmOZb218JkPGtvSHsKg8S8hyyg=
github.com/jonboulle/clockwork v0.1.0/go.mod h1:Ii8DK3G1RaLaWxj9trq07+26W01tbo22gdxWY5EU2bo=
github.com/jonboulle/clockwork v0.2.2/go.mod h1:Pkfl5aHPm1nk2H9h0bjmnJD/BcgbGXUBGnn1kMkgxc8=
github.com/josharian/intern v1.0.0 h1:vlS4z54oSdjm0bgjRigI+G1HpF+tI+9rE5LLzOg8HmY=
@@ -681,8 +683,8 @@ github.com/ugorji/go v1.2.6/go.mod h1:anCg0y61KIhDlPZmnH+so+RQbysYVyDko0IMgJv0Nn
github.com/ugorji/go/codec v1.1.7/go.mod h1:Ax+UKWsSmolVDwsd+7N3ZtXu+yMGCf907BLYF3GoBXY=
github.com/ugorji/go/codec v1.2.6 h1:7kbGefxLoDBuYXOms4yD7223OpNMMPNPZxXk5TvFcyQ=
github.com/ugorji/go/codec v1.2.6/go.mod h1:V6TCNZ4PHqoHGFZuSG1W8nrCzzdgA2DozYxWFFpvxTw=
github.com/up9inc/basenine/client/go v0.0.0-20220326121918-785f3061c8ce h1:vMTCpKItc9OyTLJXocNaq2NcBU5EnurJgTVOYb8W8dw=
github.com/up9inc/basenine/client/go v0.0.0-20220326121918-785f3061c8ce/go.mod h1:SvJGPoa/6erhUQV7kvHBwM/0x5LyO6XaG2lUaCaKiUI=
github.com/up9inc/basenine/client/go v0.0.0-20220413023528-c741e4aa1cf2 h1:2Ol+X82EOLac/GGP/oB0ACVP/g2vIJrAuzCjZcn+RRI=
github.com/up9inc/basenine/client/go v0.0.0-20220413023528-c741e4aa1cf2/go.mod h1:SvJGPoa/6erhUQV7kvHBwM/0x5LyO6XaG2lUaCaKiUI=
github.com/vishvananda/netns v0.0.0-20211101163701-50045581ed74 h1:gga7acRE695APm9hlsSMoOoE65U4/TcqNj90mc69Rlg=
github.com/vishvananda/netns v0.0.0-20211101163701-50045581ed74/go.mod h1:DD4vA1DwXk04H54A1oHXtwZmA0grkVMdPxx/VGLCah0=
github.com/wI2L/jsondiff v0.1.1 h1:r2TkoEet7E4JMO5+s1RCY2R0LrNPNHY6hbDeow2hRHw=

View File

@@ -18,6 +18,7 @@ import (
"github.com/gin-gonic/gin"
"github.com/up9inc/mizu/agent/pkg/dependency"
"github.com/up9inc/mizu/agent/pkg/elastic"
"github.com/up9inc/mizu/agent/pkg/entries"
"github.com/up9inc/mizu/agent/pkg/middlewares"
"github.com/up9inc/mizu/agent/pkg/models"
"github.com/up9inc/mizu/agent/pkg/oas"
@@ -198,7 +199,7 @@ func runInHarReaderMode() {
func enableExpFeatureIfNeeded() {
if config.Config.OAS {
oasGenerator := dependency.GetInstance(dependency.OasGeneratorDependency).(oas.OasGenerator)
oasGenerator.Start()
oasGenerator.Start(nil)
}
if config.Config.ServiceMap {
serviceMapGenerator := dependency.GetInstance(dependency.ServiceMapGeneratorDependency).(servicemap.ServiceMap)
@@ -371,4 +372,7 @@ func handleIncomingMessageAsTapper(socketConnection *websocket.Conn) {
func initializeDependencies() {
dependency.RegisterGenerator(dependency.ServiceMapGeneratorDependency, func() interface{} { return servicemap.GetDefaultServiceMapInstance() })
dependency.RegisterGenerator(dependency.OasGeneratorDependency, func() interface{} { return oas.GetDefaultOasGeneratorInstance() })
dependency.RegisterGenerator(dependency.EntriesProvider, func() interface{} { return &entries.BasenineEntriesProvider{} })
dependency.RegisterGenerator(dependency.EntriesSocketStreamer, func() interface{} { return &api.BasenineEntryStreamer{} })
dependency.RegisterGenerator(dependency.EntryStreamerSocketConnector, func() interface{} { return &api.DefaultEntryStreamerSocketConnector{} })
}

View File

@@ -0,0 +1,57 @@
package api
import (
"fmt"
basenine "github.com/up9inc/basenine/client/go"
"github.com/up9inc/mizu/agent/pkg/models"
"github.com/up9inc/mizu/shared/logger"
tapApi "github.com/up9inc/mizu/tap/api"
)
type EntryStreamerSocketConnector interface {
SendEntry(socketId int, entry *tapApi.Entry, params *WebSocketParams)
SendMetadata(socketId int, metadata *basenine.Metadata)
SendToastError(socketId int, err error)
CleanupSocket(socketId int)
}
type DefaultEntryStreamerSocketConnector struct{}
func (e *DefaultEntryStreamerSocketConnector) SendEntry(socketId int, entry *tapApi.Entry, params *WebSocketParams) {
var message []byte
if params.EnableFullEntries {
message, _ = models.CreateFullEntryWebSocketMessage(entry)
} else {
extension := extensionsMap[entry.Protocol.Name]
base := extension.Dissector.Summarize(entry)
message, _ = models.CreateBaseEntryWebSocketMessage(base)
}
if err := SendToSocket(socketId, message); err != nil {
logger.Log.Error(err)
}
}
func (e *DefaultEntryStreamerSocketConnector) SendMetadata(socketId int, metadata *basenine.Metadata) {
metadataBytes, _ := models.CreateWebsocketQueryMetadataMessage(metadata)
if err := SendToSocket(socketId, metadataBytes); err != nil {
logger.Log.Error(err)
}
}
func (e *DefaultEntryStreamerSocketConnector) SendToastError(socketId int, err error) {
toastBytes, _ := models.CreateWebsocketToastMessage(&models.ToastMessage{
Type: "error",
AutoClose: 5000,
Text: fmt.Sprintf("Syntax error: %s", err.Error()),
})
if err := SendToSocket(socketId, toastBytes); err != nil {
logger.Log.Error(err)
}
}
func (e *DefaultEntryStreamerSocketConnector) CleanupSocket(socketId int) {
socketObj := connectedWebsockets[socketId]
socketCleanup(socketId, socketObj)
}

View File

@@ -11,6 +11,8 @@ import (
"strings"
"time"
"github.com/up9inc/mizu/agent/pkg/models"
"github.com/up9inc/mizu/agent/pkg/dependency"
"github.com/up9inc/mizu/agent/pkg/elastic"
"github.com/up9inc/mizu/agent/pkg/har"
@@ -19,8 +21,6 @@ import (
"github.com/up9inc/mizu/agent/pkg/servicemap"
"github.com/up9inc/mizu/agent/pkg/models"
"github.com/up9inc/mizu/agent/pkg/oas"
"github.com/up9inc/mizu/agent/pkg/resolver"
"github.com/up9inc/mizu/agent/pkg/utils"
@@ -105,9 +105,11 @@ func startReadingChannel(outputItems <-chan *tapApi.OutputChannelItem, extension
connection, err := basenine.NewConnection(shared.BasenineHost, shared.BaseninePort)
if err != nil {
panic(err)
logger.Log.Panicf("Can't establish a new connection to Basenine server: %v", err)
}
if err = connection.InsertMode(); err != nil {
logger.Log.Panicf("Insert mode call failed: %v", err)
}
connection.InsertMode()
disableOASValidation := false
ctx := context.Background()
@@ -140,20 +142,6 @@ func startReadingChannel(outputItems <-chan *tapApi.OutputChannelItem, extension
rules, _, _ := models.RunValidationRulesState(*harEntry, mizuEntry.Destination.Name)
mizuEntry.Rules = rules
}
entryWSource := oas.EntryWithSource{
Entry: *harEntry,
Source: mizuEntry.Source.Name,
Destination: mizuEntry.Destination.Name,
Id: mizuEntry.Id,
}
if entryWSource.Destination == "" {
entryWSource.Destination = mizuEntry.Destination.IP + ":" + mizuEntry.Destination.Port
}
oasGenerator := dependency.GetInstance(dependency.OasGeneratorDependency).(oas.OasGeneratorSink)
oasGenerator.PushEntry(&entryWSource)
}
data, err := json.Marshal(mizuEntry)
@@ -163,7 +151,9 @@ func startReadingChannel(outputItems <-chan *tapApi.OutputChannelItem, extension
providers.EntryAdded(len(data))
connection.SendText(string(data))
if err = connection.SendText(string(data)); err != nil {
logger.Log.Panicf("An error occured while inserting a new record to database: %v", err)
}
serviceMapGenerator := dependency.GetInstance(dependency.ServiceMapGeneratorDependency).(servicemap.ServiceMapSink)
serviceMapGenerator.NewTCPEntry(mizuEntry.Source, mizuEntry.Destination, &item.Protocol)
@@ -183,6 +173,7 @@ func resolveIP(connectionInfo *tapApi.ConnectionInfo) (resolvedSource string, re
}
} else {
resolvedSource = resolvedSourceObject.FullAddress
namespace = resolvedSourceObject.Namespace
}
unresolvedDestination := fmt.Sprintf("%s:%s", connectionInfo.ServerIP, connectionInfo.ServerPort)
@@ -194,7 +185,11 @@ func resolveIP(connectionInfo *tapApi.ConnectionInfo) (resolvedSource string, re
}
} else {
resolvedDestination = resolvedDestinationObject.FullAddress
namespace = resolvedDestinationObject.Namespace
// Overwrite namespace (if it was set according to the source)
// Only overwrite if non-empty
if resolvedDestinationObject.Namespace != "" {
namespace = resolvedDestinationObject.Namespace
}
}
}
return resolvedSource, resolvedDestination, namespace

View File

@@ -0,0 +1,94 @@
package api
import (
"context"
"encoding/json"
basenine "github.com/up9inc/basenine/client/go"
"github.com/up9inc/mizu/agent/pkg/dependency"
"github.com/up9inc/mizu/shared"
"github.com/up9inc/mizu/shared/logger"
tapApi "github.com/up9inc/mizu/tap/api"
)
type EntryStreamer interface {
Get(ctx context.Context, socketId int, params *WebSocketParams) error
}
type BasenineEntryStreamer struct{}
func (e *BasenineEntryStreamer) Get(ctx context.Context, socketId int, params *WebSocketParams) error {
var connection *basenine.Connection
entryStreamerSocketConnector := dependency.GetInstance(dependency.EntryStreamerSocketConnector).(EntryStreamerSocketConnector)
connection, err := basenine.NewConnection(shared.BasenineHost, shared.BaseninePort)
if err != nil {
logger.Log.Errorf("failed to establish a connection to Basenine: %v", err)
entryStreamerSocketConnector.CleanupSocket(socketId)
return err
}
data := make(chan []byte)
meta := make(chan []byte)
query := params.Query
err = basenine.Validate(shared.BasenineHost, shared.BaseninePort, query)
if err != nil {
entryStreamerSocketConnector.SendToastError(socketId, err)
}
handleDataChannel := func(c *basenine.Connection, data chan []byte) {
for {
bytes := <-data
if string(bytes) == basenine.CloseChannel {
return
}
var entry *tapApi.Entry
err = json.Unmarshal(bytes, &entry)
if err != nil {
logger.Log.Debugf("Error unmarshalling entry: %v", err.Error())
continue
}
entryStreamerSocketConnector.SendEntry(socketId, entry, params)
}
}
handleMetaChannel := func(c *basenine.Connection, meta chan []byte) {
for {
bytes := <-meta
if string(bytes) == basenine.CloseChannel {
return
}
var metadata *basenine.Metadata
err = json.Unmarshal(bytes, &metadata)
if err != nil {
logger.Log.Debugf("Error unmarshalling metadata: %v", err.Error())
continue
}
entryStreamerSocketConnector.SendMetadata(socketId, metadata)
}
}
go handleDataChannel(connection, data)
go handleMetaChannel(connection, meta)
if err = connection.Query(query, data, meta); err != nil {
logger.Log.Panicf("Query mode call failed: %v", err)
}
go func() {
<-ctx.Done()
data <- []byte(basenine.CloseChannel)
meta <- []byte(basenine.CloseChannel)
connection.Close()
}()
return nil
}

View File

@@ -1,19 +1,15 @@
package api
import (
"encoding/json"
"fmt"
"net/http"
"sync"
"time"
"github.com/up9inc/mizu/agent/pkg/models"
"github.com/up9inc/mizu/agent/pkg/utils"
"github.com/gin-gonic/gin"
"github.com/gorilla/websocket"
basenine "github.com/up9inc/basenine/client/go"
"github.com/up9inc/mizu/shared"
"github.com/up9inc/mizu/agent/pkg/models"
"github.com/up9inc/mizu/agent/pkg/utils"
"github.com/up9inc/mizu/shared/logger"
tapApi "github.com/up9inc/mizu/tap/api"
)
@@ -25,9 +21,9 @@ func InitExtensionsMap(ref map[string]*tapApi.Extension) {
}
type EventHandlers interface {
WebSocketConnect(socketId int, isTapper bool)
WebSocketConnect(c *gin.Context, socketId int, isTapper bool)
WebSocketDisconnect(socketId int, isTapper bool)
WebSocketMessage(socketId int, message []byte)
WebSocketMessage(socketId int, isTapper bool, message []byte)
}
type SocketConnection struct {
@@ -62,11 +58,11 @@ func init() {
func WebSocketRoutes(app *gin.Engine, eventHandlers EventHandlers) {
SocketGetBrowserHandler = func(c *gin.Context) {
websocketHandler(c.Writer, c.Request, eventHandlers, false)
websocketHandler(c, eventHandlers, false)
}
SocketGetTapperHandler = func(c *gin.Context) {
websocketHandler(c.Writer, c.Request, eventHandlers, true)
websocketHandler(c, eventHandlers, true)
}
app.GET("/ws", func(c *gin.Context) {
@@ -78,10 +74,10 @@ func WebSocketRoutes(app *gin.Engine, eventHandlers EventHandlers) {
})
}
func websocketHandler(w http.ResponseWriter, r *http.Request, eventHandlers EventHandlers, isTapper bool) {
ws, err := websocketUpgrader.Upgrade(w, r, nil)
func websocketHandler(c *gin.Context, eventHandlers EventHandlers, isTapper bool) {
ws, err := websocketUpgrader.Upgrade(c.Writer, c.Request, nil)
if err != nil {
logger.Log.Errorf("Failed to set websocket upgrade: %v", err)
logger.Log.Errorf("failed to set websocket upgrade: %v", err)
return
}
@@ -93,30 +89,11 @@ func websocketHandler(w http.ResponseWriter, r *http.Request, eventHandlers Even
websocketIdsLock.Unlock()
var connection *basenine.Connection
var isQuerySet bool
// `!isTapper` means it's a connection from the web UI
if !isTapper {
connection, err = basenine.NewConnection(shared.BasenineHost, shared.BaseninePort)
if err != nil {
logger.Log.Errorf("Failed to establish a connection to Basenine: %v", err)
socketCleanup(socketId, connectedWebsockets[socketId])
return
}
}
data := make(chan []byte)
meta := make(chan []byte)
defer func() {
socketCleanup(socketId, connectedWebsockets[socketId])
data <- []byte(basenine.CloseChannel)
meta <- []byte(basenine.CloseChannel)
connection.Close()
}()
eventHandlers.WebSocketConnect(socketId, isTapper)
eventHandlers.WebSocketConnect(c, socketId, isTapper)
startTimeBytes, _ := models.CreateWebsocketStartTimeMessage(utils.StartTime)
@@ -124,127 +101,32 @@ func websocketHandler(w http.ResponseWriter, r *http.Request, eventHandlers Even
logger.Log.Error(err)
}
var params WebSocketParams
for {
_, msg, err := ws.ReadMessage()
if err != nil {
if _, ok := err.(*websocket.CloseError); ok {
logger.Log.Debugf("Received websocket close message, socket id: %d", socketId)
logger.Log.Debugf("received websocket close message, socket id: %d", socketId)
} else {
logger.Log.Errorf("Error reading message, socket id: %d, error: %v", socketId, err)
logger.Log.Errorf("error reading message, socket id: %d, error: %v", socketId, err)
}
break
}
if !isTapper && !isQuerySet {
if err := json.Unmarshal(msg, &params); err != nil {
logger.Log.Errorf("Error unmarshalling parameters: %v", socketId, err)
continue
}
query := params.Query
err = basenine.Validate(shared.BasenineHost, shared.BaseninePort, query)
if err != nil {
toastBytes, _ := models.CreateWebsocketToastMessage(&models.ToastMessage{
Type: "error",
AutoClose: 5000,
Text: fmt.Sprintf("Syntax error: %s", err.Error()),
})
if err := SendToSocket(socketId, toastBytes); err != nil {
logger.Log.Error(err)
}
break
}
isQuerySet = true
handleDataChannel := func(c *basenine.Connection, data chan []byte) {
for {
bytes := <-data
if string(bytes) == basenine.CloseChannel {
return
}
var entry *tapApi.Entry
err = json.Unmarshal(bytes, &entry)
if err != nil {
logger.Log.Debugf("Error unmarshalling entry: %v", err.Error())
continue
}
var message []byte
if params.EnableFullEntries {
message, _ = models.CreateFullEntryWebSocketMessage(entry)
} else {
extension := extensionsMap[entry.Protocol.Name]
base := extension.Dissector.Summarize(entry)
message, _ = models.CreateBaseEntryWebSocketMessage(base)
}
if err := SendToSocket(socketId, message); err != nil {
logger.Log.Error(err)
}
}
}
handleMetaChannel := func(c *basenine.Connection, meta chan []byte) {
for {
bytes := <-meta
if string(bytes) == basenine.CloseChannel {
return
}
var metadata *basenine.Metadata
err = json.Unmarshal(bytes, &metadata)
if err != nil {
logger.Log.Debugf("Error unmarshalling metadata: %v", err.Error())
continue
}
metadataBytes, _ := models.CreateWebsocketQueryMetadataMessage(metadata)
if err := SendToSocket(socketId, metadataBytes); err != nil {
logger.Log.Error(err)
}
}
}
go handleDataChannel(connection, data)
go handleMetaChannel(connection, meta)
connection.Query(query, data, meta)
} else {
eventHandlers.WebSocketMessage(socketId, msg)
}
eventHandlers.WebSocketMessage(socketId, isTapper, msg)
}
}
func socketCleanup(socketId int, socketConnection *SocketConnection) {
err := socketConnection.connection.Close()
if err != nil {
logger.Log.Errorf("Error closing socket connection for socket id %d: %v", socketId, err)
}
websocketIdsLock.Lock()
connectedWebsockets[socketId] = nil
websocketIdsLock.Unlock()
socketConnection.eventHandlers.WebSocketDisconnect(socketId, socketConnection.isTapper)
}
func SendToSocket(socketId int, message []byte) error {
socketObj := connectedWebsockets[socketId]
if socketObj == nil {
return fmt.Errorf("Socket %v is disconnected", socketId)
return fmt.Errorf("socket %v is disconnected", socketId)
}
var sent = false
time.AfterFunc(time.Second*5, func() {
if !sent {
logger.Log.Error("Socket timed out")
logger.Log.Error("socket timed out")
socketCleanup(socketId, socketObj)
}
})
@@ -255,7 +137,20 @@ func SendToSocket(socketId int, message []byte) error {
sent = true
if err != nil {
return fmt.Errorf("Failed to write message to socket %v, err: %w", socketId, err)
return fmt.Errorf("failed to write message to socket %v, err: %w", socketId, err)
}
return nil
}
func socketCleanup(socketId int, socketConnection *SocketConnection) {
err := socketConnection.connection.Close()
if err != nil {
logger.Log.Errorf("error closing socket connection for socket id %d: %v", socketId, err)
}
websocketIdsLock.Lock()
connectedWebsockets[socketId] = nil
websocketIdsLock.Unlock()
socketConnection.eventHandlers.WebSocketDisconnect(socketId, socketConnection.isTapper)
}

View File

@@ -1,12 +1,13 @@
package api
import (
"context"
"encoding/json"
"fmt"
"sync"
"github.com/gin-gonic/gin"
"github.com/up9inc/mizu/agent/pkg/dependency"
"github.com/up9inc/mizu/agent/pkg/models"
"github.com/up9inc/mizu/agent/pkg/providers"
"github.com/up9inc/mizu/agent/pkg/providers/tappedPods"
"github.com/up9inc/mizu/agent/pkg/providers/tappers"
"github.com/up9inc/mizu/agent/pkg/up9"
@@ -17,7 +18,11 @@ import (
"github.com/up9inc/mizu/shared/logger"
)
var browserClientSocketUUIDs = make([]int, 0)
type BrowserClient struct {
dataStreamCancelFunc context.CancelFunc
}
var browserClients = make(map[int]*BrowserClient, 0)
var tapperClientSocketUUIDs = make([]int, 0)
var socketListLock = sync.Mutex{}
@@ -30,7 +35,7 @@ func init() {
go up9.UpdateAnalyzeStatus(BroadcastToBrowserClients)
}
func (h *RoutesEventHandlers) WebSocketConnect(socketId int, isTapper bool) {
func (h *RoutesEventHandlers) WebSocketConnect(_ *gin.Context, socketId int, isTapper bool) {
if isTapper {
logger.Log.Infof("Websocket event - Tapper connected, socket ID: %d", socketId)
tappers.Connected()
@@ -45,7 +50,7 @@ func (h *RoutesEventHandlers) WebSocketConnect(socketId int, isTapper bool) {
logger.Log.Infof("Websocket event - Browser socket connected, socket ID: %d", socketId)
socketListLock.Lock()
browserClientSocketUUIDs = append(browserClientSocketUUIDs, socketId)
browserClients[socketId] = &BrowserClient{}
socketListLock.Unlock()
BroadcastTappedPodsStatus()
@@ -63,13 +68,16 @@ func (h *RoutesEventHandlers) WebSocketDisconnect(socketId int, isTapper bool) {
} else {
logger.Log.Infof("Websocket event - Browser socket disconnected, socket ID: %d", socketId)
socketListLock.Lock()
removeSocketUUIDFromBrowserSlice(socketId)
if browserClients[socketId] != nil && browserClients[socketId].dataStreamCancelFunc != nil {
browserClients[socketId].dataStreamCancelFunc()
}
delete(browserClients, socketId)
socketListLock.Unlock()
}
}
func BroadcastToBrowserClients(message []byte) {
for _, socketId := range browserClientSocketUUIDs {
for socketId := range browserClients {
go func(socketId int) {
if err := SendToSocket(socketId, message); err != nil {
logger.Log.Error(err)
@@ -88,7 +96,33 @@ func BroadcastToTapperClients(message []byte) {
}
}
func (h *RoutesEventHandlers) WebSocketMessage(_ int, message []byte) {
func (h *RoutesEventHandlers) WebSocketMessage(socketId int, isTapper bool, message []byte) {
if isTapper {
HandleTapperIncomingMessage(message, h.SocketOutChannel, BroadcastToBrowserClients)
} else {
// we initiate the basenine stream after the first websocket message we receive (it contains the entry query), we then store a cancelfunc to later cancel this stream
if browserClients[socketId] != nil && browserClients[socketId].dataStreamCancelFunc == nil {
var params WebSocketParams
if err := json.Unmarshal(message, &params); err != nil {
logger.Log.Errorf("Error: %v", socketId, err)
return
}
entriesStreamer := dependency.GetInstance(dependency.EntriesSocketStreamer).(EntryStreamer)
ctx, cancelFunc := context.WithCancel(context.Background())
err := entriesStreamer.Get(ctx, socketId, &params)
if err != nil {
logger.Log.Errorf("error initializing basenine stream for browser socket %d %+v", socketId, err)
cancelFunc()
} else {
browserClients[socketId].dataStreamCancelFunc = cancelFunc
}
}
}
}
func HandleTapperIncomingMessage(message []byte, socketOutChannel chan<- *tapApi.OutputChannelItem, broadcastMessageFunc func([]byte)) {
var socketMessageBase shared.WebSocketMessageMetadata
err := json.Unmarshal(message, &socketMessageBase)
if err != nil {
@@ -102,7 +136,7 @@ func (h *RoutesEventHandlers) WebSocketMessage(_ int, message []byte) {
logger.Log.Infof("Could not unmarshal message of message type %s %v", socketMessageBase.MessageType, err)
} else {
// NOTE: This is where the message comes back from the intermediate WebSocket to code.
h.SocketOutChannel <- tappedEntryMessage.Data
socketOutChannel <- tappedEntryMessage.Data
}
case shared.WebSocketMessageTypeUpdateStatus:
var statusMessage shared.WebSocketStatusMessage
@@ -110,15 +144,7 @@ func (h *RoutesEventHandlers) WebSocketMessage(_ int, message []byte) {
if err != nil {
logger.Log.Infof("Could not unmarshal message of message type %s %v", socketMessageBase.MessageType, err)
} else {
BroadcastToBrowserClients(message)
}
case shared.WebsocketMessageTypeOutboundLink:
var outboundLinkMessage models.WebsocketOutboundLinkMessage
err := json.Unmarshal(message, &outboundLinkMessage)
if err != nil {
logger.Log.Infof("Could not unmarshal message of message type %s %v", socketMessageBase.MessageType, err)
} else {
handleTLSLink(outboundLinkMessage)
broadcastMessageFunc(message)
}
default:
logger.Log.Infof("Received socket message of type %s for which no handlers are defined", socketMessageBase.MessageType)
@@ -126,39 +152,6 @@ func (h *RoutesEventHandlers) WebSocketMessage(_ int, message []byte) {
}
}
func handleTLSLink(outboundLinkMessage models.WebsocketOutboundLinkMessage) {
resolvedNameObject := k8sResolver.Resolve(outboundLinkMessage.Data.DstIP)
if resolvedNameObject != nil {
outboundLinkMessage.Data.DstIP = resolvedNameObject.FullAddress
} else if outboundLinkMessage.Data.SuggestedResolvedName != "" {
outboundLinkMessage.Data.DstIP = outboundLinkMessage.Data.SuggestedResolvedName
}
cacheKey := fmt.Sprintf("%s -> %s:%d", outboundLinkMessage.Data.Src, outboundLinkMessage.Data.DstIP, outboundLinkMessage.Data.DstPort)
_, isInCache := providers.RecentTLSLinks.Get(cacheKey)
if isInCache {
return
} else {
providers.RecentTLSLinks.SetDefault(cacheKey, outboundLinkMessage.Data)
}
marshaledMessage, err := json.Marshal(outboundLinkMessage)
if err != nil {
logger.Log.Errorf("Error marshaling outbound link message for broadcasting: %v", err)
} else {
logger.Log.Errorf("Broadcasting outboundlink message %s", string(marshaledMessage))
BroadcastToBrowserClients(marshaledMessage)
}
}
func removeSocketUUIDFromBrowserSlice(uuidToRemove int) {
newUUIDSlice := make([]int, 0, len(browserClientSocketUUIDs))
for _, uuid := range browserClientSocketUUIDs {
if uuid != uuidToRemove {
newUUIDSlice = append(newUUIDSlice, uuid)
}
}
browserClientSocketUUIDs = newUUIDSlice
}
func removeSocketUUIDFromTapperSlice(uuidToRemove int) {
newUUIDSlice := make([]int, 0, len(tapperClientSocketUUIDs))
for _, uuid := range tapperClientSocketUUIDs {

View File

@@ -67,7 +67,7 @@ func ConfigureBasenineServer(host string, port string, dbSize int64, logLevel lo
wait.WithProto("tcp"),
wait.WithWait(200*time.Millisecond),
wait.WithBreak(50*time.Millisecond),
wait.WithDeadline(5*time.Second),
wait.WithDeadline(20*time.Second),
wait.WithDebug(logLevel == logging.DEBUG),
).Do([]string{fmt.Sprintf("%s:%s", host, port)}) {
logger.Log.Panicf("Basenine is not available!")

View File

@@ -1,25 +1,19 @@
package controllers
import (
"encoding/json"
"net/http"
"strconv"
"time"
"github.com/up9inc/mizu/agent/pkg/app"
"github.com/up9inc/mizu/agent/pkg/har"
"github.com/up9inc/mizu/agent/pkg/dependency"
"github.com/up9inc/mizu/agent/pkg/entries"
"github.com/up9inc/mizu/agent/pkg/models"
"github.com/up9inc/mizu/agent/pkg/validation"
"github.com/gin-gonic/gin"
basenine "github.com/up9inc/basenine/client/go"
"github.com/up9inc/mizu/shared"
"github.com/up9inc/mizu/shared/logger"
tapApi "github.com/up9inc/mizu/tap/api"
)
func Error(c *gin.Context, err error) bool {
func HandleEntriesError(c *gin.Context, err error) bool {
if err != nil {
logger.Log.Errorf("Error getting entry: %v", err)
_ = c.Error(err)
@@ -49,45 +43,18 @@ func GetEntries(c *gin.Context) {
entriesRequest.TimeoutMs = 3000
}
data, meta, err := basenine.Fetch(shared.BasenineHost, shared.BaseninePort,
entriesRequest.LeftOff, entriesRequest.Direction, entriesRequest.Query,
entriesRequest.Limit, time.Duration(entriesRequest.TimeoutMs)*time.Millisecond)
if err != nil {
c.JSON(http.StatusInternalServerError, validationError)
}
response := &models.EntriesResponse{}
var dataSlice []interface{}
for _, row := range data {
var entry *tapApi.Entry
err = json.Unmarshal(row, &entry)
if err != nil {
c.JSON(http.StatusBadRequest, gin.H{
"error": true,
"type": "error",
"autoClose": "5000",
"msg": string(row),
})
return // exit
entriesProvider := dependency.GetInstance(dependency.EntriesProvider).(entries.EntriesProvider)
entries, metadata, err := entriesProvider.GetEntries(entriesRequest)
if !HandleEntriesError(c, err) {
baseEntries := make([]interface{}, 0)
for _, entry := range entries {
baseEntries = append(baseEntries, entry.Base)
}
extension := app.ExtensionsMap[entry.Protocol.Name]
base := extension.Dissector.Summarize(entry)
dataSlice = append(dataSlice, base)
c.JSON(http.StatusOK, models.EntriesResponse{
Data: baseEntries,
Meta: metadata,
})
}
var metadata *basenine.Metadata
err = json.Unmarshal(meta, &metadata)
if err != nil {
logger.Log.Debugf("Error recieving metadata: %v", err.Error())
}
response.Data = dataSlice
response.Meta = metadata
c.JSON(http.StatusOK, response)
}
func GetEntry(c *gin.Context) {
@@ -101,55 +68,12 @@ func GetEntry(c *gin.Context) {
c.JSON(http.StatusBadRequest, validationError)
}
id, _ := strconv.Atoi(c.Param("id"))
var entry *tapApi.Entry
bytes, err := basenine.Single(shared.BasenineHost, shared.BaseninePort, id, singleEntryRequest.Query)
if Error(c, err) {
return // exit
}
err = json.Unmarshal(bytes, &entry)
if err != nil {
c.JSON(http.StatusNotFound, gin.H{
"error": true,
"type": "error",
"autoClose": "5000",
"msg": string(bytes),
})
return // exit
}
id := c.Param("id")
extension := app.ExtensionsMap[entry.Protocol.Name]
base := extension.Dissector.Summarize(entry)
var representation []byte
representation, err = extension.Dissector.Represent(entry.Request, entry.Response)
if err != nil {
c.JSON(http.StatusNotFound, gin.H{
"error": true,
"type": "error",
"autoClose": "5000",
"msg": err.Error(),
})
return // exit
}
entriesProvider := dependency.GetInstance(dependency.EntriesProvider).(entries.EntriesProvider)
entry, err := entriesProvider.GetEntry(singleEntryRequest, id)
var rules []map[string]interface{}
var isRulesEnabled bool
if entry.Protocol.Name == "http" {
harEntry, _ := har.NewEntry(entry.Request, entry.Response, entry.StartTime, entry.ElapsedTime)
_, rulesMatched, _isRulesEnabled := models.RunValidationRulesState(*harEntry, entry.Destination.Name)
isRulesEnabled = _isRulesEnabled
inrec, _ := json.Marshal(rulesMatched)
if err := json.Unmarshal(inrec, &rules); err != nil {
logger.Log.Error(err)
}
if !HandleEntriesError(c, err) {
c.JSON(http.StatusOK, entry)
}
c.JSON(http.StatusOK, tapApi.EntryWrapper{
Protocol: entry.Protocol,
Representation: string(representation),
Data: entry,
Base: base,
Rules: rules,
IsRulesEnabled: isRulesEnabled,
})
}

View File

@@ -1,8 +1,12 @@
package controllers
import (
"bytes"
basenine "github.com/up9inc/basenine/client/go"
"net"
"net/http/httptest"
"testing"
"time"
"github.com/up9inc/mizu/agent/pkg/dependency"
"github.com/up9inc/mizu/agent/pkg/oas"
@@ -11,39 +15,55 @@ import (
)
func TestGetOASServers(t *testing.T) {
dependency.RegisterGenerator(dependency.OasGeneratorDependency, func() interface{} { return oas.GetDefaultOasGeneratorInstance() })
recorder := httptest.NewRecorder()
c, _ := gin.CreateTestContext(recorder)
oas.GetDefaultOasGeneratorInstance().Start()
oas.GetDefaultOasGeneratorInstance().GetServiceSpecs().Store("some", oas.NewGen("some"))
recorder, c := getRecorderAndContext()
GetOASServers(c)
t.Logf("Written body: %s", recorder.Body.String())
}
func TestGetOASAllSpecs(t *testing.T) {
dependency.RegisterGenerator(dependency.OasGeneratorDependency, func() interface{} { return oas.GetDefaultOasGeneratorInstance() })
recorder := httptest.NewRecorder()
c, _ := gin.CreateTestContext(recorder)
oas.GetDefaultOasGeneratorInstance().Start()
oas.GetDefaultOasGeneratorInstance().GetServiceSpecs().Store("some", oas.NewGen("some"))
recorder, c := getRecorderAndContext()
GetOASAllSpecs(c)
t.Logf("Written body: %s", recorder.Body.String())
}
func TestGetOASSpec(t *testing.T) {
dependency.RegisterGenerator(dependency.OasGeneratorDependency, func() interface{} { return oas.GetDefaultOasGeneratorInstance() })
recorder := httptest.NewRecorder()
c, _ := gin.CreateTestContext(recorder)
oas.GetDefaultOasGeneratorInstance().Start()
oas.GetDefaultOasGeneratorInstance().GetServiceSpecs().Store("some", oas.NewGen("some"))
recorder, c := getRecorderAndContext()
c.Params = []gin.Param{{Key: "id", Value: "some"}}
GetOASSpec(c)
t.Logf("Written body: %s", recorder.Body.String())
}
type fakeConn struct {
sendBuffer *bytes.Buffer
receiveBuffer *bytes.Buffer
}
func (f fakeConn) Read(p []byte) (int, error) { return f.sendBuffer.Read(p) }
func (f fakeConn) Write(p []byte) (int, error) { return f.receiveBuffer.Write(p) }
func (fakeConn) Close() error { return nil }
func (fakeConn) LocalAddr() net.Addr { return nil }
func (fakeConn) RemoteAddr() net.Addr { return nil }
func (fakeConn) SetDeadline(t time.Time) error { return nil }
func (fakeConn) SetReadDeadline(t time.Time) error { return nil }
func (fakeConn) SetWriteDeadline(t time.Time) error { return nil }
func getRecorderAndContext() (*httptest.ResponseRecorder, *gin.Context) {
dummyConn := new(basenine.Connection)
dummyConn.Conn = fakeConn{
sendBuffer: bytes.NewBufferString("\n"),
receiveBuffer: bytes.NewBufferString("\n"),
}
dependency.RegisterGenerator(dependency.OasGeneratorDependency, func() interface{} {
return oas.GetDefaultOasGeneratorInstance()
})
recorder := httptest.NewRecorder()
c, _ := gin.CreateTestContext(recorder)
oas.GetDefaultOasGeneratorInstance().Start(dummyConn)
oas.GetDefaultOasGeneratorInstance().GetServiceSpecs().Store("some", oas.NewGen("some"))
return recorder, c
}

View File

@@ -101,16 +101,18 @@ func (s *ServiceMapControllerSuite) TestGet() {
// response nodes
aNode := servicemap.ServiceMapNode{
Id: 1,
Name: TCPEntryA.Name,
Entry: TCPEntryA,
Count: 1,
Id: 1,
Name: TCPEntryA.Name,
Entry: TCPEntryA,
Resolved: true,
Count: 1,
}
bNode := servicemap.ServiceMapNode{
Id: 2,
Name: TCPEntryB.Name,
Entry: TCPEntryB,
Count: 1,
Id: 2,
Name: TCPEntryB.Name,
Entry: TCPEntryB,
Resolved: true,
Count: 1,
}
assert.Contains(response.Nodes, aNode)
assert.Contains(response.Nodes, bNode)

View File

@@ -5,4 +5,7 @@ type DependencyContainerType string
const (
ServiceMapGeneratorDependency = "ServiceMapGeneratorDependency"
OasGeneratorDependency = "OasGeneratorDependency"
EntriesProvider = "EntriesProvider"
EntriesSocketStreamer = "EntriesSocketStreamer"
EntryStreamerSocketConnector = "EntryStreamerSocketConnector"
)

View File

@@ -0,0 +1,98 @@
package entries
import (
"encoding/json"
"time"
basenine "github.com/up9inc/basenine/client/go"
"github.com/up9inc/mizu/agent/pkg/app"
"github.com/up9inc/mizu/agent/pkg/har"
"github.com/up9inc/mizu/agent/pkg/models"
"github.com/up9inc/mizu/shared"
"github.com/up9inc/mizu/shared/logger"
tapApi "github.com/up9inc/mizu/tap/api"
)
type EntriesProvider interface {
GetEntries(entriesRequest *models.EntriesRequest) ([]*tapApi.EntryWrapper, *basenine.Metadata, error)
GetEntry(singleEntryRequest *models.SingleEntryRequest, entryId string) (*tapApi.EntryWrapper, error)
}
type BasenineEntriesProvider struct{}
func (e *BasenineEntriesProvider) GetEntries(entriesRequest *models.EntriesRequest) ([]*tapApi.EntryWrapper, *basenine.Metadata, error) {
data, meta, err := basenine.Fetch(shared.BasenineHost, shared.BaseninePort,
entriesRequest.LeftOff, entriesRequest.Direction, entriesRequest.Query,
entriesRequest.Limit, time.Duration(entriesRequest.TimeoutMs)*time.Millisecond)
if err != nil {
return nil, nil, err
}
var dataSlice []*tapApi.EntryWrapper
for _, row := range data {
var entry *tapApi.Entry
err = json.Unmarshal(row, &entry)
if err != nil {
return nil, nil, err
}
extension := app.ExtensionsMap[entry.Protocol.Name]
base := extension.Dissector.Summarize(entry)
dataSlice = append(dataSlice, &tapApi.EntryWrapper{
Protocol: entry.Protocol,
Data: entry,
Base: base,
})
}
var metadata *basenine.Metadata
err = json.Unmarshal(meta, &metadata)
if err != nil {
logger.Log.Debugf("Error recieving metadata: %v", err.Error())
}
return dataSlice, metadata, nil
}
func (e *BasenineEntriesProvider) GetEntry(singleEntryRequest *models.SingleEntryRequest, entryId string) (*tapApi.EntryWrapper, error) {
var entry *tapApi.Entry
bytes, err := basenine.Single(shared.BasenineHost, shared.BaseninePort, entryId, singleEntryRequest.Query)
if err != nil {
return nil, err
}
err = json.Unmarshal(bytes, &entry)
if err != nil {
return nil, err
}
extension := app.ExtensionsMap[entry.Protocol.Name]
base := extension.Dissector.Summarize(entry)
var representation []byte
representation, err = extension.Dissector.Represent(entry.Request, entry.Response)
if err != nil {
return nil, err
}
var rules []map[string]interface{}
var isRulesEnabled bool
if entry.Protocol.Name == "http" {
harEntry, _ := har.NewEntry(entry.Request, entry.Response, entry.StartTime, entry.ElapsedTime)
_, rulesMatched, _isRulesEnabled := models.RunValidationRulesState(*harEntry, entry.Destination.Name)
isRulesEnabled = _isRulesEnabled
inrec, _ := json.Marshal(rulesMatched)
if err := json.Unmarshal(inrec, &rules); err != nil {
logger.Log.Error(err)
}
}
return &tapApi.EntryWrapper{
Protocol: entry.Protocol,
Representation: string(representation),
Data: entry,
Base: base,
Rules: rules,
IsRulesEnabled: isRulesEnabled,
}, nil
}

View File

@@ -13,7 +13,7 @@ import (
)
type EntriesRequest struct {
LeftOff int `form:"leftOff" validate:"required,min=-1"`
LeftOff string `form:"leftOff" validate:"required"`
Direction int `form:"direction" validate:"required,oneof='1' '-1'"`
Query string `form:"query"`
Limit int `form:"limit" validate:"required,min=1"`

View File

@@ -4,6 +4,7 @@ import (
"bufio"
"encoding/json"
"errors"
"fmt"
"io"
"io/ioutil"
"net/url"
@@ -67,23 +68,23 @@ func fileSize(fname string) int64 {
return fi.Size()
}
func feedEntries(fromFiles []string, isSync bool) (count int, err error) {
func feedEntries(fromFiles []string, isSync bool, gen *defaultOasGenerator) (count uint, err error) {
badFiles := make([]string, 0)
cnt := 0
cnt := uint(0)
for _, file := range fromFiles {
logger.Log.Info("Processing file: " + file)
ext := strings.ToLower(filepath.Ext(file))
eCnt := 0
eCnt := uint(0)
switch ext {
case ".har":
eCnt, err = feedFromHAR(file, isSync)
eCnt, err = feedFromHAR(file, isSync, gen)
if err != nil {
logger.Log.Warning("Failed processing file: " + err.Error())
badFiles = append(badFiles, file)
continue
}
case ".ldjson":
eCnt, err = feedFromLDJSON(file, isSync)
eCnt, err = feedFromLDJSON(file, isSync, gen)
if err != nil {
logger.Log.Warning("Failed processing file: " + err.Error())
badFiles = append(badFiles, file)
@@ -102,7 +103,7 @@ func feedEntries(fromFiles []string, isSync bool) (count int, err error) {
return cnt, nil
}
func feedFromHAR(file string, isSync bool) (int, error) {
func feedFromHAR(file string, isSync bool, gen *defaultOasGenerator) (uint, error) {
fd, err := os.Open(file)
if err != nil {
panic(err)
@@ -121,16 +122,16 @@ func feedFromHAR(file string, isSync bool) (int, error) {
return 0, err
}
cnt := 0
cnt := uint(0)
for _, entry := range harDoc.Log.Entries {
cnt += 1
feedEntry(&entry, "", isSync, file)
feedEntry(&entry, "", file, gen, fmt.Sprintf("%024d", cnt))
}
return cnt, nil
}
func feedEntry(entry *har.Entry, source string, isSync bool, file string) {
func feedEntry(entry *har.Entry, source string, file string, gen *defaultOasGenerator, cnt string) {
entry.Comment = file
if entry.Response.Status == 302 {
logger.Log.Debugf("Dropped traffic entry due to permanent redirect status: %s", entry.StartedDateTime)
@@ -145,15 +146,11 @@ func feedEntry(entry *har.Entry, source string, isSync bool, file string) {
logger.Log.Errorf("Failed to parse entry URL: %v, err: %v", entry.Request.URL, err)
}
ews := EntryWithSource{Entry: *entry, Source: source, Destination: u.Host, Id: uint(0)}
if isSync {
GetDefaultOasGeneratorInstance().entriesChan <- ews // blocking variant, right?
} else {
GetDefaultOasGeneratorInstance().PushEntry(&ews)
}
ews := EntryWithSource{Entry: *entry, Source: source, Destination: u.Host, Id: cnt}
gen.handleHARWithSource(&ews)
}
func feedFromLDJSON(file string, isSync bool) (int, error) {
func feedFromLDJSON(file string, isSync bool, gen *defaultOasGenerator) (uint, error) {
fd, err := os.Open(file)
if err != nil {
panic(err)
@@ -165,7 +162,7 @@ func feedFromLDJSON(file string, isSync bool) (int, error) {
var meta map[string]interface{}
buf := strings.Builder{}
cnt := 0
cnt := uint(0)
source := ""
for {
substr, isPrefix, err := reader.ReadLine()
@@ -196,7 +193,7 @@ func feedFromLDJSON(file string, isSync bool) (int, error) {
logger.Log.Warningf("Failed decoding entry: %s", line)
} else {
cnt += 1
feedEntry(&entry, source, isSync, file)
feedEntry(&entry, source, file, gen, fmt.Sprintf("%024d", cnt))
}
}
}

View File

@@ -0,0 +1,96 @@
{
"openapi": "3.1.0",
"info": {
"title": "http://carts",
"description": "Mizu observed 3 entries (0 failed), at 2.287 hits/s, average response time is 0.017 seconds",
"version": "1.0"
},
"servers": [
{
"url": "http://carts"
}
],
"paths": {
"/carts/{cartId}/items": {
"get": {
"summary": "/carts/{cartId}/items",
"description": "Mizu observed 3 entries (0 failed), at 2.287 hits/s, average response time is 0.017 seconds",
"operationId": "84c9b926-1f73-4ab4-b381-3c124528959f",
"responses": {
"200": {
"description": "Successful call with status 200",
"content": {
"application/json": {
"example": [
{
"id": "60fe98fb86c0fc000869a90c",
"itemId": "3395a43e-2d88-40de-b95f-e00e1502085b",
"quantity": 1,
"unitPrice": 18
}
],
"x-sample-entry": "000000000000000000000010"
}
},
"x-sample-entry": "000000000000000000000010"
}
},
"x-counters-per-source": {
"some-source": {
"entries": 3,
"failures": 0,
"firstSeen": 1627298058.3798368,
"lastSeen": 1627298065.2397773,
"sumRT": 0.05,
"sumDuration": 6.859940528869629
}
},
"x-counters-total": {
"entries": 3,
"failures": 0,
"firstSeen": 1627298058.3798368,
"lastSeen": 1627298065.2397773,
"sumRT": 0.05,
"sumDuration": 6.859940528869629
},
"x-last-seen-ts": 1627298065.2397773,
"x-sample-entry": "000000000000000000000010"
},
"parameters": [
{
"name": "cartId",
"in": "path",
"required": true,
"style": "simple",
"schema": {
"type": "string"
},
"examples": {
"example #0": {
"value": "mHK0P7zTktmV1zv57iWAvCTd43FFMHap"
}
},
"x-sample-entry": "000000000000000000000010"
}
]
}
},
"x-counters-per-source": {
"some-source": {
"entries": 3,
"failures": 0,
"firstSeen": 1627298058.3798368,
"lastSeen": 1627298065.2397773,
"sumRT": 0.05,
"sumDuration": 6.859940528869629
}
},
"x-counters-total": {
"entries": 3,
"failures": 0,
"firstSeen": 1627298058.3798368,
"lastSeen": 1627298065.2397773,
"sumRT": 0.05,
"sumDuration": 6.859940528869629
}
}

View File

@@ -0,0 +1,485 @@
{
"openapi": "3.1.0",
"info": {
"title": "Preloaded",
"description": "Test file for loading pre-existing OAS",
"version": "0.1"
},
"paths": {
"/catalogue": {
"get": {
"tags": [
"catalogue"
],
"summary": "/catalogue",
"description": "Mizu observed 3 entries (0 failed), at 2.647 hits/s, average response time is 0.008 seconds",
"operationId": "dd6c3dbe-6b6b-4ddd-baed-757e237ddb8a",
"parameters": [
{
"name": "page",
"in": "query",
"required": false,
"style": "form",
"schema": {
"type": "string"
},
"examples": {
"example #0": {
"value": "1"
}
},
"x-sample-entry": "000000000000000000000002"
},
{
"name": "size",
"in": "query",
"required": true,
"style": "form",
"schema": {
"type": "string"
},
"examples": {
"example #0": {
"value": "6"
},
"example #1": {
"value": "3"
},
"example #2": {
"value": "5"
}
},
"x-sample-entry": "000000000000000000000011"
},
{
"name": "tags",
"in": "query",
"required": false,
"style": "form",
"schema": {
"type": "string"
},
"examples": {
"example #0": {
"value": ""
},
"example #1": {
"value": "blue"
}
},
"x-sample-entry": "000000000000000000000007"
},
{
"name": "sort",
"in": "query",
"required": false,
"style": "form",
"schema": {
"type": "string"
},
"examples": {
"example #0": {
"value": "id"
}
},
"x-sample-entry": "000000000000000000000007"
}
],
"responses": {
"200": {
"description": "Successful call with status 200",
"content": {
"application/json": {
"example": [
{
"count": 1,
"description": "Socks fit for a Messiah. You too can experience walking in water with these special edition beauties. Each hole is lovingly proggled to leave smooth edges. The only sock approved by a higher power.",
"id": "03fef6ac-1896-4ce8-bd69-b798f85c6e0b",
"imageUrl": [
"/catalogue/images/holy_1.jpeg",
"/catalogue/images/holy_2.jpeg"
],
"name": "Holy",
"price": 99.99,
"tag": [
"action",
"magic"
]
},
{
"count": 438,
"description": "proident occaecat irure et excepteur labore minim nisi amet irure",
"id": "3395a43e-2d88-40de-b95f-e00e1502085b",
"imageUrl": [
"/catalogue/images/colourful_socks.jpg",
"/catalogue/images/colourful_socks.jpg"
],
"name": "Colourful",
"price": 18,
"tag": [
"brown",
"blue"
]
},
{
"count": 820,
"description": "Ready for action. Engineers: be ready to smash that next bug! Be ready, with these super-action-sport-masterpieces. This particular engineer was chased away from the office with a stick.",
"id": "510a0d7e-8e83-4193-b483-e27e09ddc34d",
"imageUrl": [
"/catalogue/images/puma_1.jpeg",
"/catalogue/images/puma_2.jpeg"
],
"name": "SuperSport XL",
"price": 15,
"tag": [
"sport",
"formal",
"black"
]
},
{
"count": 738,
"description": "A mature sock, crossed, with an air of nonchalance.",
"id": "808a2de1-1aaa-4c25-a9b9-6612e8f29a38",
"imageUrl": [
"/catalogue/images/cross_1.jpeg",
"/catalogue/images/cross_2.jpeg"
],
"name": "Crossed",
"price": 17.32,
"tag": [
"blue",
"action",
"red",
"formal"
]
},
{
"count": 808,
"description": "enim officia aliqua excepteur esse deserunt quis aliquip nostrud anim",
"id": "819e1fbf-8b7e-4f6d-811f-693534916a8b",
"imageUrl": [
"/catalogue/images/WAT.jpg",
"/catalogue/images/WAT2.jpg"
],
"name": "Figueroa",
"price": 14,
"tag": [
"green",
"formal",
"blue"
]
},
{
"count": 175,
"description": "consequat amet cupidatat minim laborum tempor elit ex consequat in",
"id": "837ab141-399e-4c1f-9abc-bace40296bac",
"imageUrl": [
"/catalogue/images/catsocks.jpg",
"/catalogue/images/catsocks2.jpg"
],
"name": "Cat socks",
"price": 15,
"tag": [
"brown",
"formal",
"green"
]
}
],
"x-sample-entry": "000000000000000000000011"
}
},
"x-sample-entry": "000000000000000000000011"
}
},
"x-counters-per-source": {
"some-source": {
"entries": 3,
"failures": 0,
"firstSeen": 1627298057.7849188,
"lastSeen": 1627298065.7258668,
"sumRT": 0.024999999999999998,
"sumDuration": 7.940948009490967
}
},
"x-counters-total": {
"entries": 3,
"failures": 0,
"firstSeen": 1627298057.7849188,
"lastSeen": 1627298065.7258668,
"sumRT": 0.024999999999999998,
"sumDuration": 7.940948009490967
},
"x-last-seen-ts": 1627298065.7258668,
"x-sample-entry": "000000000000000000000011"
}
},
"/catalogue/size": {
"get": {
"tags": [
"catalogue"
],
"summary": "/catalogue/size",
"description": "Mizu observed 1 entries (0 failed), at 0.000 hits/s, average response time is 0.013 seconds",
"operationId": "2315e69d-9d66-48cf-b3d3-fec9c30bd28b",
"parameters": [
{
"name": "tags",
"in": "query",
"required": true,
"style": "form",
"schema": {
"type": "string"
},
"examples": {
"example #0": {
"value": ""
}
},
"x-sample-entry": "000000000000000000000001"
},
{
"name": "x-some",
"in": "header",
"required": true,
"style": "simple",
"schema": {
"type": "string"
},
"examples": {
"example #0": {
"value": "demo val"
}
},
"x-sample-entry": "000000000000000000000001"
}
],
"responses": {
"200": {
"description": "Successful call with status 200",
"content": {
"application/json": {
"example": {
"err": null,
"size": 9
},
"x-sample-entry": "000000000000000000000001"
}
},
"x-sample-entry": "000000000000000000000001"
}
},
"x-counters-per-source": {
"some-source": {
"entries": 1,
"failures": 0,
"firstSeen": 1627298057.7841518,
"lastSeen": 1627298057.7841518,
"sumRT": 0.013,
"sumDuration": 0
}
},
"x-counters-total": {
"entries": 1,
"failures": 0,
"firstSeen": 1627298057.7841518,
"lastSeen": 1627298057.7841518,
"sumRT": 0.013,
"sumDuration": 0
},
"x-last-seen-ts": 1627298057.7841518,
"x-sample-entry": "000000000000000000000001"
}
},
"/catalogue/{id}": {
"get": {
"tags": [
"catalogue"
],
"summary": "/catalogue/{id}",
"description": "Mizu observed 4 entries (0 failed), at 1.899 hits/s, average response time is 0.003 seconds",
"parameters": [
{
"name": "non-required-header",
"in": "header",
"required": false,
"style": "simple",
"schema": {
"type": "string"
},
"example": "some-uuid-maybe"
},
{
"name": "x-some",
"in": "header",
"required": false,
"style": "simple",
"schema": {
"type": "string"
},
"examples": {
"example #0": {
"value": "demoval"
}
},
"x-sample-entry": "000000000000000000000004"
}
],
"responses": {
"200": {
"description": "Successful call with status 200",
"content": {
"application/json": {
"example": {
"count": 438,
"description": "proident occaecat irure et excepteur labore minim nisi amet irure",
"id": "3395a43e-2d88-40de-b95f-e00e1502085b",
"imageUrl": [
"/catalogue/images/colourful_socks.jpg",
"/catalogue/images/colourful_socks.jpg"
],
"name": "Colourful",
"price": 18,
"tag": [
"brown",
"blue"
]
},
"x-sample-entry": "000000000000000000000012"
}
},
"x-sample-entry": "000000000000000000000012"
}
},
"x-counters-per-source": {
"some-source": {
"entries": 4,
"failures": 0,
"firstSeen": 1627298058.1315014,
"lastSeen": 1627298065.7293031,
"sumRT": 0.013999999999999999,
"sumDuration": 7.597801685333252
}
},
"x-counters-total": {
"entries": 4,
"failures": 0,
"firstSeen": 1627298058.1315014,
"lastSeen": 1627298065.7293031,
"sumRT": 0.013999999999999999,
"sumDuration": 7.597801685333252
},
"x-last-seen-ts": 1627298065.7293031,
"x-sample-entry": "000000000000000000000012"
},
"parameters": [
{
"name": "id",
"in": "path",
"required": true,
"style": "simple",
"schema": {
"type": "string"
},
"examples": {
"example #0": {
"value": "3395a43e-2d88-40de-b95f-e00e1502085b"
},
"example #1": {
"value": "808a2de1-1aaa-4c25-a9b9-6612e8f29a38"
}
},
"example": "some-uuid-maybe",
"x-sample-entry": "000000000000000000000012"
}
]
},
"/catalogue/{id}/details": {
"parameters": [
{
"name": "id",
"in": "path",
"required": true,
"style": "simple",
"schema": {
"type": "string"
},
"example": "some-uuid-maybe"
}
]
},
"/tags": {
"get": {
"summary": "/tags",
"description": "Mizu observed 1 entries (0 failed), at 0.000 hits/s, average response time is 0.007 seconds",
"operationId": "c4d7d0ed-1a78-4370-a049-efe3abc631a6",
"responses": {
"200": {
"description": "Successful call with status 200",
"content": {
"application/json": {
"example": {
"err": null,
"tags": [
"brown",
"geek",
"formal",
"blue",
"skin",
"red",
"action",
"sport",
"black",
"magic",
"green"
]
},
"x-sample-entry": "000000000000000000000003"
}
},
"x-sample-entry": "000000000000000000000003"
}
},
"x-counters-per-source": {
"some-source": {
"entries": 1,
"failures": 0,
"firstSeen": 1627298057.7841816,
"lastSeen": 1627298057.7841816,
"sumRT": 0.007,
"sumDuration": 0
}
},
"x-counters-total": {
"entries": 1,
"failures": 0,
"firstSeen": 1627298057.7841816,
"lastSeen": 1627298057.7841816,
"sumRT": 0.007,
"sumDuration": 0
},
"x-last-seen-ts": 1627298057.7841816,
"x-sample-entry": "000000000000000000000003"
}
}
},
"x-counters-per-source": {
"some-source": {
"entries": 9,
"failures": 0,
"firstSeen": 1627298057.7841518,
"lastSeen": 1627298065.7293031,
"sumRT": 0.05899999999999999,
"sumDuration": 15.538749694824219
}
},
"x-counters-total": {
"entries": 9,
"failures": 0,
"firstSeen": 1627298057.7841518,
"lastSeen": 1627298065.7293031,
"sumRT": 0.05899999999999999,
"sumDuration": 15.538749694824219
}
}


@@ -0,0 +1,897 @@
{
"openapi": "3.1.0",
"info": {
"title": "https://httpbin.org",
"description": "Mizu observed 19 entries (0 failed), at 0.106 hits/s, average response time is 0.172 seconds",
"version": "1.0"
},
"servers": [
{
"url": "https://httpbin.org"
}
],
"paths": {
"/appears-once": {
"get": {
"summary": "/appears-once",
"description": "Mizu observed 1 entries (0 failed), at 0.000 hits/s, average response time is 0.630 seconds",
"operationId": "2d34623e-fde8-4720-8390-9a7439051755",
"responses": {
"200": {
"description": "Successful call with status 200",
"content": {
"application/json": {
"example": null,
"x-sample-entry": "000000000000000000000004"
}
},
"x-sample-entry": "000000000000000000000004"
}
},
"x-counters-per-source": {
"": {
"entries": 1,
"failures": 0,
"firstSeen": 1567750580.0471218,
"lastSeen": 1567750580.0471218,
"sumRT": 0.63,
"sumDuration": 0
}
},
"x-counters-total": {
"entries": 1,
"failures": 0,
"firstSeen": 1567750580.0471218,
"lastSeen": 1567750580.0471218,
"sumRT": 0.63,
"sumDuration": 0
},
"x-last-seen-ts": 1567750580.0471218,
"x-sample-entry": "000000000000000000000004"
}
},
"/appears-twice": {
"get": {
"summary": "/appears-twice",
"description": "Mizu observed 2 entries (0 failed), at 0.500 hits/s, average response time is 0.630 seconds",
"operationId": "9c5330f3-8062-468b-b5a3-df1ad82b4846",
"responses": {
"200": {
"description": "Successful call with status 200",
"content": {
"application/json": {
"example": null,
"x-sample-entry": "000000000000000000000006"
}
},
"x-sample-entry": "000000000000000000000006"
}
},
"x-counters-per-source": {
"": {
"entries": 2,
"failures": 0,
"firstSeen": 1567750580.7471218,
"lastSeen": 1567750581.7471218,
"sumRT": 1.26,
"sumDuration": 1
}
},
"x-counters-total": {
"entries": 2,
"failures": 0,
"firstSeen": 1567750580.7471218,
"lastSeen": 1567750581.7471218,
"sumRT": 1.26,
"sumDuration": 1
},
"x-last-seen-ts": 1567750581.7471218,
"x-sample-entry": "000000000000000000000006"
}
},
"/body-optional": {
"post": {
"summary": "/body-optional",
"description": "Mizu observed 3 entries (0 failed), at 0.003 hits/s, average response time is 0.001 seconds",
"operationId": "34f3d66c-b1f7-4dca-9cab-987fcc8ae472",
"responses": {
"200": {
"description": "Successful call with status 200",
"content": {
"": {
"x-sample-entry": "000000000000000000000012"
}
},
"x-sample-entry": "000000000000000000000012"
}
},
"x-counters-per-source": {
"": {
"entries": 3,
"failures": 0,
"firstSeen": 1567750581.7471218,
"lastSeen": 1567750581.757122,
"sumRT": 0.003,
"sumDuration": 0.010000228881835938
}
},
"x-counters-total": {
"entries": 3,
"failures": 0,
"firstSeen": 1567750581.7471218,
"lastSeen": 1567750581.757122,
"sumRT": 0.003,
"sumDuration": 0.010000228881835938
},
"x-last-seen-ts": 1567750581.757122,
"x-sample-entry": "000000000000000000000012",
"requestBody": {
"description": "Generic request body",
"content": {
"application/json": {
"example": "{\"key\", \"val\"}",
"x-sample-entry": "000000000000000000000011"
}
},
"x-sample-entry": "000000000000000000000012"
}
}
},
"/body-required": {
"post": {
"summary": "/body-required",
"description": "Mizu observed 1 entries (0 failed), at 0.000 hits/s, average response time is 0.001 seconds",
"operationId": "ff6add53-ab1c-4d4e-b590-0835fa318276",
"responses": {
"200": {
"description": "Successful call with status 200",
"content": {
"": {
"x-sample-entry": "000000000000000000000013"
}
},
"x-sample-entry": "000000000000000000000013"
}
},
"x-counters-per-source": {
"": {
"entries": 1,
"failures": 0,
"firstSeen": 1567750581.757122,
"lastSeen": 1567750581.757122,
"sumRT": 0.001,
"sumDuration": 0
}
},
"x-counters-total": {
"entries": 1,
"failures": 0,
"firstSeen": 1567750581.757122,
"lastSeen": 1567750581.757122,
"sumRT": 0.001,
"sumDuration": 0
},
"x-last-seen-ts": 1567750581.757122,
"x-sample-entry": "000000000000000000000013",
"requestBody": {
"description": "Generic request body",
"content": {
"": {
"example": "body exists",
"x-sample-entry": "000000000000000000000013"
}
},
"required": true,
"x-sample-entry": "000000000000000000000013"
}
}
},
"/form-multipart": {
"post": {
"summary": "/form-multipart",
"description": "Mizu observed 1 entries (0 failed), at 0.000 hits/s, average response time is 0.001 seconds",
"operationId": "153f0925-9fc7-4e9f-9d33-f1470f25f0f7",
"responses": {
"200": {
"description": "Successful call with status 200",
"content": {
"": {
"example": {},
"x-sample-entry": "000000000000000000000009"
}
},
"x-sample-entry": "000000000000000000000009"
}
},
"x-counters-per-source": {
"": {
"entries": 1,
"failures": 0,
"firstSeen": 1567750582.7471218,
"lastSeen": 1567750582.7471218,
"sumRT": 0.001,
"sumDuration": 0
}
},
"x-counters-total": {
"entries": 1,
"failures": 0,
"firstSeen": 1567750582.7471218,
"lastSeen": 1567750582.7471218,
"sumRT": 0.001,
"sumDuration": 0
},
"x-last-seen-ts": 1567750582.7471218,
"x-sample-entry": "000000000000000000000009",
"requestBody": {
"description": "Generic request body",
"content": {
"multipart/form-data": {
"schema": {
"type": "object",
"required": [
"file",
"path"
],
"properties": {
"file": {
"type": "string",
"contentMediaType": "application/json",
"examples": [
"{\"functions\": 123}"
]
},
"path": {
"type": "string",
"examples": [
"/content/components"
]
}
}
},
"example": "--BOUNDARY\r\nContent-Disposition: form-data; name=\"file\"; filename=\"metadata.json\"\r\nContent-Type: application/json\r\n\r\n{\"functions\": 123}\r\n--BOUNDARY\r\nContent-Disposition: form-data; name=\"path\"\r\n\r\n/content/components\r\n--BOUNDARY--\r\n",
"x-sample-entry": "000000000000000000000009"
}
},
"required": true,
"x-sample-entry": "000000000000000000000009"
}
}
},
"/form-urlencoded": {
"post": {
"summary": "/form-urlencoded",
"description": "Mizu observed 2 entries (0 failed), at 0.500 hits/s, average response time is 0.001 seconds",
"operationId": "c92189f5-5636-46eb-ac71-92b17941a568",
"responses": {
"200": {
"description": "Successful call with status 200",
"content": {
"": {
"x-sample-entry": "000000000000000000000008"
}
},
"x-sample-entry": "000000000000000000000008"
}
},
"x-counters-per-source": {
"": {
"entries": 2,
"failures": 0,
"firstSeen": 1567750580.7471218,
"lastSeen": 1567750581.7471218,
"sumRT": 0.002,
"sumDuration": 1
}
},
"x-counters-total": {
"entries": 2,
"failures": 0,
"firstSeen": 1567750580.7471218,
"lastSeen": 1567750581.7471218,
"sumRT": 0.002,
"sumDuration": 1
},
"x-last-seen-ts": 1567750581.7471218,
"x-sample-entry": "000000000000000000000008",
"requestBody": {
"description": "Generic request body",
"content": {
"application/x-www-form-urlencoded": {
"schema": {
"type": "object",
"required": [
"agent-id",
"callback-url",
"token"
],
"properties": {
"agent-id": {
"type": "string",
"examples": [
"ade"
]
},
"callback-url": {
"type": "string",
"examples": [
""
]
},
"optional": {
"type": "string",
"examples": [
"another"
]
},
"token": {
"type": "string",
"examples": [
"sometoken",
"sometoken-second-val"
]
}
}
},
"example": "agent-id=ade\u0026callback-url=\u0026token=sometoken",
"x-sample-entry": "000000000000000000000008"
}
},
"required": true,
"x-sample-entry": "000000000000000000000008"
}
}
},
"/param-patterns/prefix-gibberish-fine/{prefixgibberishfineId}": {
"get": {
"tags": [
"param-patterns"
],
"summary": "/param-patterns/prefix-gibberish-fine/{prefixgibberishfineId}",
"description": "Mizu observed 1 entries (0 failed), at 0.000 hits/s, average response time is 0.001 seconds",
"operationId": "85270437-7aae-4a5b-b988-3662092463d0",
"responses": {
"200": {
"description": "Successful call with status 200",
"content": {
"": {
"x-sample-entry": "000000000000000000000014"
}
},
"x-sample-entry": "000000000000000000000014"
}
},
"x-counters-per-source": {
"": {
"entries": 1,
"failures": 0,
"firstSeen": 1567750582,
"lastSeen": 1567750582,
"sumRT": 0.001,
"sumDuration": 0
}
},
"x-counters-total": {
"entries": 1,
"failures": 0,
"firstSeen": 1567750582,
"lastSeen": 1567750582,
"sumRT": 0.001,
"sumDuration": 0
},
"x-last-seen-ts": 1567750582,
"x-sample-entry": "000000000000000000000014"
},
"parameters": [
{
"name": "prefixgibberishfineId",
"in": "path",
"required": true,
"style": "simple",
"schema": {
"type": "string"
},
"examples": {
"example #0": {
"value": "234324"
}
},
"x-sample-entry": "000000000000000000000014"
}
]
},
"/param-patterns/{parampatternId}": {
"get": {
"tags": [
"param-patterns"
],
"summary": "/param-patterns/{parampatternId}",
"description": "Mizu observed 2 entries (0 failed), at 0.000 hits/s, average response time is 0.001 seconds",
"operationId": "da597734-1cf5-4d3b-917b-6b02dacf7b7b",
"responses": {
"200": {
"description": "Successful call with status 200",
"content": {
"": {
"x-sample-entry": "000000000000000000000018"
}
},
"x-sample-entry": "000000000000000000000018"
}
},
"x-counters-per-source": {
"": {
"entries": 2,
"failures": 0,
"firstSeen": 1567750582.000003,
"lastSeen": 1567750582.000004,
"sumRT": 0.002,
"sumDuration": 9.5367431640625e-7
}
},
"x-counters-total": {
"entries": 2,
"failures": 0,
"firstSeen": 1567750582.000003,
"lastSeen": 1567750582.000004,
"sumRT": 0.002,
"sumDuration": 9.5367431640625e-7
},
"x-last-seen-ts": 1567750582.000004,
"x-sample-entry": "000000000000000000000018"
},
"parameters": [
{
"name": "parampatternId",
"in": "path",
"required": true,
"style": "simple",
"schema": {
"type": "string",
"pattern": "^prefix-gibberish-.+"
},
"examples": {
"example #0": {
"value": "prefix-gibberish-sfdlasdfkadf87sd93284q24r"
},
"example #1": {
"value": "prefix-gibberish-adslkfasdf89sa7dfasddafa8a98sd7kansdf"
},
"example #2": {
"value": "prefix-gibberish-4jk5l2345h2452l4352435jlk45"
},
"example #3": {
"value": "prefix-gibberish-84395h2j4k35hj243j5h2kl34h54k"
},
"example #4": {
"value": "prefix-gibberish-afterwards"
}
},
"x-sample-entry": "000000000000000000000019"
}
]
},
"/param-patterns/{parampatternId}/1": {
"get": {
"tags": [
"param-patterns"
],
"summary": "/param-patterns/{parampatternId}/1",
"description": "Mizu observed 1 entries (0 failed), at 0.000 hits/s, average response time is 0.001 seconds",
"operationId": "e965a245-9cfc-48ed-94e1-f765eadb3960",
"responses": {
"200": {
"description": "Successful call with status 200",
"content": {
"": {
"x-sample-entry": "000000000000000000000015"
}
},
"x-sample-entry": "000000000000000000000015"
}
},
"x-counters-per-source": {
"": {
"entries": 1,
"failures": 0,
"firstSeen": 1567750582.000001,
"lastSeen": 1567750582.000001,
"sumRT": 0.001,
"sumDuration": 0
}
},
"x-counters-total": {
"entries": 1,
"failures": 0,
"firstSeen": 1567750582.000001,
"lastSeen": 1567750582.000001,
"sumRT": 0.001,
"sumDuration": 0
},
"x-last-seen-ts": 1567750582.000001,
"x-sample-entry": "000000000000000000000015"
},
"parameters": [
{
"name": "parampatternId",
"in": "path",
"required": true,
"style": "simple",
"schema": {
"type": "string",
"pattern": "^prefix-gibberish-.+"
},
"examples": {
"example #0": {
"value": "prefix-gibberish-sfdlasdfkadf87sd93284q24r"
},
"example #1": {
"value": "prefix-gibberish-adslkfasdf89sa7dfasddafa8a98sd7kansdf"
},
"example #2": {
"value": "prefix-gibberish-4jk5l2345h2452l4352435jlk45"
},
"example #3": {
"value": "prefix-gibberish-84395h2j4k35hj243j5h2kl34h54k"
},
"example #4": {
"value": "prefix-gibberish-afterwards"
}
},
"x-sample-entry": "000000000000000000000019"
}
]
},
"/param-patterns/{parampatternId}/static": {
"get": {
"tags": [
"param-patterns"
],
"summary": "/param-patterns/{parampatternId}/static",
"description": "Mizu observed 1 entries (0 failed), at 0.000 hits/s, average response time is 0.001 seconds",
"operationId": "7af420dc-f8b7-450f-8f6f-18b039aa3cde",
"responses": {
"200": {
"description": "Successful call with status 200",
"content": {
"": {
"x-sample-entry": "000000000000000000000016"
}
},
"x-sample-entry": "000000000000000000000016"
}
},
"x-counters-per-source": {
"": {
"entries": 1,
"failures": 0,
"firstSeen": 1567750582.000002,
"lastSeen": 1567750582.000002,
"sumRT": 0.001,
"sumDuration": 0
}
},
"x-counters-total": {
"entries": 1,
"failures": 0,
"firstSeen": 1567750582.000002,
"lastSeen": 1567750582.000002,
"sumRT": 0.001,
"sumDuration": 0
},
"x-last-seen-ts": 1567750582.000002,
"x-sample-entry": "000000000000000000000016"
},
"parameters": [
{
"name": "parampatternId",
"in": "path",
"required": true,
"style": "simple",
"schema": {
"type": "string",
"pattern": "^prefix-gibberish-.+"
},
"examples": {
"example #0": {
"value": "prefix-gibberish-sfdlasdfkadf87sd93284q24r"
},
"example #1": {
"value": "prefix-gibberish-adslkfasdf89sa7dfasddafa8a98sd7kansdf"
},
"example #2": {
"value": "prefix-gibberish-4jk5l2345h2452l4352435jlk45"
},
"example #3": {
"value": "prefix-gibberish-84395h2j4k35hj243j5h2kl34h54k"
},
"example #4": {
"value": "prefix-gibberish-afterwards"
}
},
"x-sample-entry": "000000000000000000000019"
}
]
},
"/param-patterns/{parampatternId}/{param1}": {
"get": {
"tags": [
"param-patterns"
],
"summary": "/param-patterns/{parampatternId}/{param1}",
"description": "Mizu observed 1 entries (0 failed), at 0.000 hits/s, average response time is 0.001 seconds",
"operationId": "02a1771d-2d50-4a8c-8be2-29c7e59b8435",
"responses": {
"200": {
"description": "Successful call with status 200",
"content": {
"": {
"x-sample-entry": "000000000000000000000019"
}
},
"x-sample-entry": "000000000000000000000019"
}
},
"x-counters-per-source": {
"": {
"entries": 1,
"failures": 0,
"firstSeen": 1567750582.000002,
"lastSeen": 1567750582.000002,
"sumRT": 0.001,
"sumDuration": 0
}
},
"x-counters-total": {
"entries": 1,
"failures": 0,
"firstSeen": 1567750582.000002,
"lastSeen": 1567750582.000002,
"sumRT": 0.001,
"sumDuration": 0
},
"x-last-seen-ts": 1567750582.000002,
"x-sample-entry": "000000000000000000000019"
},
"parameters": [
{
"name": "param1",
"in": "path",
"required": true,
"style": "simple",
"schema": {
"type": "string"
},
"examples": {
"example #0": {
"value": "23421"
}
},
"x-sample-entry": "000000000000000000000019"
},
{
"name": "parampatternId",
"in": "path",
"required": true,
"style": "simple",
"schema": {
"type": "string",
"pattern": "^prefix-gibberish-.+"
},
"examples": {
"example #0": {
"value": "prefix-gibberish-sfdlasdfkadf87sd93284q24r"
},
"example #1": {
"value": "prefix-gibberish-adslkfasdf89sa7dfasddafa8a98sd7kansdf"
},
"example #2": {
"value": "prefix-gibberish-4jk5l2345h2452l4352435jlk45"
},
"example #3": {
"value": "prefix-gibberish-84395h2j4k35hj243j5h2kl34h54k"
},
"example #4": {
"value": "prefix-gibberish-afterwards"
}
},
"x-sample-entry": "000000000000000000000019"
}
]
},
"/{Id}": {
"get": {
"summary": "/{Id}",
"description": "Mizu observed 1 entries (0 failed), at 0.000 hits/s, average response time is 0.630 seconds",
"operationId": "77ec4910-d47a-46a5-8234-fb80a11034b4",
"responses": {
"200": {
"description": "Successful call with status 200",
"content": {
"application/json": {
"example": null,
"x-sample-entry": "000000000000000000000003"
}
},
"x-sample-entry": "000000000000000000000003"
}
},
"x-counters-per-source": {
"": {
"entries": 1,
"failures": 0,
"firstSeen": 1567750579.7471218,
"lastSeen": 1567750579.7471218,
"sumRT": 0.63,
"sumDuration": 0
}
},
"x-counters-total": {
"entries": 1,
"failures": 0,
"firstSeen": 1567750579.7471218,
"lastSeen": 1567750579.7471218,
"sumRT": 0.63,
"sumDuration": 0
},
"x-last-seen-ts": 1567750579.7471218,
"x-sample-entry": "000000000000000000000003"
},
"parameters": [
{
"name": "Id",
"in": "path",
"required": true,
"style": "simple",
"schema": {
"type": "string"
},
"examples": {
"example #0": {
"value": "e21f7112-3d3b-4632-9da3-a4af2e0e9166"
},
"example #1": {
"value": "952bea17-3776-11ea-9341-42010a84012a"
}
},
"x-sample-entry": "000000000000000000000003"
}
]
},
"/{Id}/sub1": {
"get": {
"summary": "/{Id}/sub1",
"description": "Mizu observed 1 entries (0 failed), at 0.000 hits/s, average response time is 0.111 seconds",
"operationId": "198675eb-9faf-407b-83fa-0483a730bbbe",
"responses": {
"200": {
"description": "Successful call with status 200",
"content": {
"text/html": {
"x-sample-entry": "000000000000000000000001"
}
},
"x-sample-entry": "000000000000000000000001"
}
},
"x-counters-per-source": {
"": {
"entries": 1,
"failures": 0,
"firstSeen": 1567750483.864529,
"lastSeen": 1567750483.864529,
"sumRT": 0.111,
"sumDuration": 0
}
},
"x-counters-total": {
"entries": 1,
"failures": 0,
"firstSeen": 1567750483.864529,
"lastSeen": 1567750483.864529,
"sumRT": 0.111,
"sumDuration": 0
},
"x-last-seen-ts": 1567750483.864529,
"x-sample-entry": "000000000000000000000001"
},
"parameters": [
{
"name": "Id",
"in": "path",
"required": true,
"style": "simple",
"schema": {
"type": "string"
},
"examples": {
"example #0": {
"value": "e21f7112-3d3b-4632-9da3-a4af2e0e9166"
},
"example #1": {
"value": "952bea17-3776-11ea-9341-42010a84012a"
}
},
"x-sample-entry": "000000000000000000000003"
}
]
},
"/{Id}/sub2": {
"get": {
"summary": "/{Id}/sub2",
"description": "Mizu observed 1 entries (0 failed), at 0.000 hits/s, average response time is 0.630 seconds",
"operationId": "31d880f1-152f-4dd6-84a7-463e13b694a5",
"responses": {
"200": {
"description": "Successful call with status 200",
"content": {
"application/json": {
"example": null,
"x-sample-entry": "000000000000000000000002"
}
},
"x-sample-entry": "000000000000000000000002"
}
},
"x-counters-per-source": {
"": {
"entries": 1,
"failures": 0,
"firstSeen": 1567750578.7471218,
"lastSeen": 1567750578.7471218,
"sumRT": 0.63,
"sumDuration": 0
}
},
"x-counters-total": {
"entries": 1,
"failures": 0,
"firstSeen": 1567750578.7471218,
"lastSeen": 1567750578.7471218,
"sumRT": 0.63,
"sumDuration": 0
},
"x-last-seen-ts": 1567750578.7471218,
"x-sample-entry": "000000000000000000000002"
},
"parameters": [
{
"name": "Id",
"in": "path",
"required": true,
"style": "simple",
"schema": {
"type": "string"
},
"examples": {
"example #0": {
"value": "e21f7112-3d3b-4632-9da3-a4af2e0e9166"
},
"example #1": {
"value": "952bea17-3776-11ea-9341-42010a84012a"
}
},
"x-sample-entry": "000000000000000000000003"
}
]
}
},
"x-counters-per-source": {
"": {
"entries": 19,
"failures": 0,
"firstSeen": 1567750483.864529,
"lastSeen": 1567750582.7471218,
"sumRT": 3.273999999999999,
"sumDuration": 2.0100011825561523
}
},
"x-counters-total": {
"entries": 19,
"failures": 0,
"firstSeen": 1567750483.864529,
"lastSeen": 1567750582.7471218,
"sumRT": 3.273999999999999,
"sumDuration": 2.0100011825561523
}
}

File diff suppressed because it is too large


@@ -6,7 +6,11 @@ import (
"net/url"
"sync"
basenine "github.com/up9inc/basenine/client/go"
"github.com/up9inc/mizu/agent/pkg/har"
"github.com/up9inc/mizu/shared"
"github.com/up9inc/mizu/tap/api"
"github.com/up9inc/mizu/shared/logger"
)
@@ -15,16 +19,12 @@ var (
instance *defaultOasGenerator
)
type OasGeneratorSink interface {
PushEntry(entryWithSource *EntryWithSource)
}
type OasGenerator interface {
Start()
Start(conn *basenine.Connection)
Stop()
IsStarted() bool
Reset()
GetServiceSpecs() *sync.Map
SetEntriesQuery(query string) bool
}
type defaultOasGenerator struct {
@@ -32,7 +32,9 @@ type defaultOasGenerator struct {
ctx context.Context
cancel context.CancelFunc
serviceSpecs *sync.Map
entriesChan chan EntryWithSource
dbConn *basenine.Connection
dbMutex sync.Mutex
entriesQuery string
}
func GetDefaultOasGeneratorInstance() *defaultOasGenerator {
@@ -43,16 +45,33 @@ func GetDefaultOasGeneratorInstance() *defaultOasGenerator {
return instance
}
func (g *defaultOasGenerator) Start() {
func (g *defaultOasGenerator) Start(conn *basenine.Connection) {
if g.started {
return
}
if g.dbConn == nil {
if conn == nil {
logger.Log.Infof("Creating new DB connection for OAS generator to address %s:%s", shared.BasenineHost, shared.BaseninePort)
newConn, err := basenine.NewConnection(shared.BasenineHost, shared.BaseninePort)
if err != nil {
logger.Log.Error("Error connecting to DB for OAS generator, err: %v", err)
return
}
conn = newConn
}
g.dbConn = conn
}
ctx, cancel := context.WithCancel(context.Background())
g.cancel = cancel
g.ctx = ctx
g.entriesChan = make(chan EntryWithSource, 100) // buffer up to 100 entries for OAS processing
g.serviceSpecs = &sync.Map{}
g.started = true
go g.runGenerator()
}
@@ -60,9 +79,17 @@ func (g *defaultOasGenerator) Stop() {
if !g.started {
return
}
g.cancel()
g.Reset()
g.started = false
g.cancel()
g.reset()
g.dbMutex.Lock()
defer g.dbMutex.Unlock()
if g.dbConn != nil {
g.dbConn.Close()
g.dbConn = nil
}
}
func (g *defaultOasGenerator) IsStarted() bool {
@@ -70,80 +97,130 @@ func (g *defaultOasGenerator) IsStarted() bool {
}
func (g *defaultOasGenerator) runGenerator() {
// Make []byte channels to receive the data and the meta
dataChan := make(chan []byte)
metaChan := make(chan []byte)
g.dbMutex.Lock()
defer g.dbMutex.Unlock()
logger.Log.Infof("Querying DB for OAS generator with query '%s'", g.entriesQuery)
if err := g.dbConn.Query(g.entriesQuery, dataChan, metaChan); err != nil {
logger.Log.Errorf("Query mode call failed: %v", err)
}
for {
select {
case <-g.ctx.Done():
logger.Log.Infof("OAS Generator was canceled")
close(dataChan)
close(metaChan)
return
case entryWithSource, ok := <-g.entriesChan:
case metaBytes, ok := <-metaChan:
if !ok {
logger.Log.Infof("OAS Generator - meta channel closed")
break
}
logger.Log.Debugf("Meta: %s", metaBytes)
case dataBytes, ok := <-dataChan:
if !ok {
logger.Log.Infof("OAS Generator - entries channel closed")
break
}
entry := entryWithSource.Entry
u, err := url.Parse(entry.Request.URL)
logger.Log.Debugf("Data: %s", dataBytes)
e := new(api.Entry)
err := json.Unmarshal(dataBytes, e)
if err != nil {
logger.Log.Errorf("Failed to parse entry URL: %v, err: %v", entry.Request.URL, err)
}
val, found := g.serviceSpecs.Load(entryWithSource.Destination)
var gen *SpecGen
if !found {
gen = NewGen(u.Scheme + "://" + entryWithSource.Destination)
g.serviceSpecs.Store(entryWithSource.Destination, gen)
} else {
gen = val.(*SpecGen)
}
opId, err := gen.feedEntry(entryWithSource)
if err != nil {
txt, suberr := json.Marshal(entry)
if suberr == nil {
logger.Log.Debugf("Problematic entry: %s", txt)
}
logger.Log.Warningf("Failed processing entry: %s", err)
continue
}
logger.Log.Debugf("Handled entry %s as opId: %s", entry.Request.URL, opId) // TODO: set opId back to entry?
g.handleEntry(e)
}
}
}
func (g *defaultOasGenerator) Reset() {
g.serviceSpecs = &sync.Map{}
func (g *defaultOasGenerator) handleEntry(mizuEntry *api.Entry) {
if mizuEntry.Protocol.Name == "http" {
entry, err := har.NewEntry(mizuEntry.Request, mizuEntry.Response, mizuEntry.StartTime, mizuEntry.ElapsedTime)
if err != nil {
logger.Log.Warningf("Failed to turn MizuEntry %d into HAR Entry: %s", mizuEntry.Id, err)
return
}
dest := mizuEntry.Destination.Name
if dest == "" {
dest = mizuEntry.Destination.IP + ":" + mizuEntry.Destination.Port
}
entryWSource := &EntryWithSource{
Entry: *entry,
Source: mizuEntry.Source.Name,
Destination: dest,
Id: mizuEntry.Id,
}
g.handleHARWithSource(entryWSource)
} else {
logger.Log.Debugf("OAS: Unsupported protocol in entry %d: %s", mizuEntry.Id, mizuEntry.Protocol.Name)
}
}
func (g *defaultOasGenerator) PushEntry(entryWithSource *EntryWithSource) {
if !g.started {
func (g *defaultOasGenerator) handleHARWithSource(entryWSource *EntryWithSource) {
entry := entryWSource.Entry
gen := g.getGen(entryWSource.Destination, entry.Request.URL)
opId, err := gen.feedEntry(entryWSource)
if err != nil {
txt, suberr := json.Marshal(entry)
if suberr == nil {
logger.Log.Debugf("Problematic entry: %s", txt)
}
logger.Log.Warningf("Failed processing entry %d: %s", entryWSource.Id, err)
return
}
select {
case g.entriesChan <- *entryWithSource:
default:
logger.Log.Warningf("OAS Generator - entry wasn't sent to channel because the channel has no buffer or there is no receiver")
logger.Log.Debugf("Handled entry %s as opId: %s", entryWSource.Id, opId) // TODO: set opId back to entry?
}
func (g *defaultOasGenerator) getGen(dest string, urlStr string) *SpecGen {
u, err := url.Parse(urlStr)
if err != nil {
logger.Log.Errorf("Failed to parse entry URL: %v, err: %v", urlStr, err)
}
val, found := g.serviceSpecs.Load(dest)
var gen *SpecGen
if !found {
gen = NewGen(u.Scheme + "://" + dest)
g.serviceSpecs.Store(dest, gen)
} else {
gen = val.(*SpecGen)
}
return gen
}
func (g *defaultOasGenerator) reset() {
g.serviceSpecs = &sync.Map{}
}
func (g *defaultOasGenerator) GetServiceSpecs() *sync.Map {
return g.serviceSpecs
}
func (g *defaultOasGenerator) SetEntriesQuery(query string) bool {
changed := g.entriesQuery != query
g.entriesQuery = query
return changed
}
func NewDefaultOasGenerator() *defaultOasGenerator {
return &defaultOasGenerator{
started: false,
ctx: nil,
cancel: nil,
serviceSpecs: nil,
entriesChan: nil,
dbConn: nil,
}
}
type EntryWithSource struct {
Source string
Destination string
Entry har.Entry
Id uint
}
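A minimal sketch of how a caller might drive the reworked generator, under two assumptions: the package import path is github.com/up9inc/mizu/agent/pkg/oas, and an empty entries query streams every entry (the exact query syntax is Basenine-specific). Passing nil to Start lets the generator open its own Basenine connection, as in the implementation above.

package main

import "github.com/up9inc/mizu/agent/pkg/oas" // import path assumed

func main() {
	gen := oas.GetDefaultOasGeneratorInstance()
	gen.SetEntriesQuery("") // assumption: empty query streams all entries
	gen.Start(nil)          // nil connection: the generator dials Basenine itself
	defer gen.Stop()        // closes the DB connection and resets the specs
}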


@@ -0,0 +1,46 @@
package oas
import (
"encoding/json"
"github.com/up9inc/mizu/agent/pkg/har"
"testing"
"time"
)
func TestOASGen(t *testing.T) {
gen := new(defaultOasGenerator)
e := new(har.Entry)
err := json.Unmarshal([]byte(`{"startedDateTime": "20000101","request": {"url": "https://host/path", "method": "GET"}, "response": {"status": 200}}`), e)
if err != nil {
panic(err)
}
ews := &EntryWithSource{
Destination: "some",
Entry: *e,
}
dummyConn := GetFakeDBConn(`{"startedDateTime": "20000101","request": {"url": "https://host/path", "method": "GET"}, "response": {"status": 200}}`)
gen.Start(dummyConn)
gen.handleHARWithSource(ews)
g, ok := gen.serviceSpecs.Load("some")
if !ok {
panic("Failed")
}
sg := g.(*SpecGen)
spec, err := sg.GetSpec()
if err != nil {
panic(err)
}
specText, _ := json.Marshal(spec)
t.Log(string(specText))
if !gen.IsStarted() {
t.Errorf("Should be started")
}
time.Sleep(100 * time.Millisecond)
gen.Stop()
}


@@ -3,10 +3,6 @@ package oas
import (
"encoding/json"
"errors"
"github.com/chanced/openapi"
"github.com/google/uuid"
"github.com/nav-inc/datetime"
"github.com/up9inc/mizu/shared/logger"
"io"
"io/ioutil"
"mime"
@@ -18,6 +14,11 @@ import (
"strings"
"sync"
"github.com/chanced/openapi"
"github.com/google/uuid"
"github.com/nav-inc/datetime"
"github.com/up9inc/mizu/shared/logger"
"github.com/up9inc/mizu/agent/pkg/har"
"time"
@@ -28,6 +29,13 @@ const CountersTotal = "x-counters-total"
const CountersPerSource = "x-counters-per-source"
const SampleId = "x-sample-entry"
type EntryWithSource struct {
Source string
Destination string
Entry har.Entry
Id string
}
type reqResp struct { // hello, generics in Go
Req *har.Request
Resp *har.Response
@@ -60,7 +68,7 @@ func (g *SpecGen) StartFromSpec(oas *openapi.OpenAPI) {
g.tree = new(Node)
for pathStr, pathObj := range oas.Paths.Items {
pathSplit := strings.Split(string(pathStr), "/")
g.tree.getOrSet(pathSplit, pathObj)
g.tree.getOrSet(pathSplit, pathObj, "")
// clean "last entry timestamp" markers from the past
for _, pathAndOp := range g.tree.listOps() {
@@ -69,11 +77,11 @@ func (g *SpecGen) StartFromSpec(oas *openapi.OpenAPI) {
}
}
func (g *SpecGen) feedEntry(entryWithSource EntryWithSource) (string, error) {
func (g *SpecGen) feedEntry(entryWithSource *EntryWithSource) (string, error) {
g.lock.Lock()
defer g.lock.Unlock()
opId, err := g.handlePathObj(&entryWithSource)
opId, err := g.handlePathObj(entryWithSource)
if err != nil {
return "", err
}
@@ -219,7 +227,7 @@ func (g *SpecGen) handlePathObj(entryWithSource *EntryWithSource) (string, error
} else {
split = strings.Split(urlParsed.Path, "/")
}
node := g.tree.getOrSet(split, new(openapi.PathObj))
node := g.tree.getOrSet(split, new(openapi.PathObj), entryWithSource.Id)
opObj, err := handleOpObj(entryWithSource, node.pathObj)
if opObj != nil {
@@ -242,12 +250,12 @@ func handleOpObj(entryWithSource *EntryWithSource, pathObj *openapi.PathObj) (*o
return nil, nil
}
err = handleRequest(&entry.Request, opObj, isSuccess)
err = handleRequest(&entry.Request, opObj, isSuccess, entryWithSource.Id)
if err != nil {
return nil, err
}
err = handleResponse(&entry.Response, opObj, isSuccess)
err = handleResponse(&entry.Response, opObj, isSuccess, entryWithSource.Id)
if err != nil {
return nil, err
}
@@ -257,6 +265,8 @@ func handleOpObj(entryWithSource *EntryWithSource, pathObj *openapi.PathObj) (*o
return nil, err
}
setSampleID(&opObj.Extensions, entryWithSource.Id)
return opObj, nil
}
@@ -329,15 +339,10 @@ func handleCounters(opObj *openapi.Operation, success bool, entryWithSource *Ent
return err
}
err = opObj.Extensions.SetExtension(SampleId, entryWithSource.Id)
if err != nil {
return err
}
return nil
}
func handleRequest(req *har.Request, opObj *openapi.Operation, isSuccess bool) error {
func handleRequest(req *har.Request, opObj *openapi.Operation, isSuccess bool, sampleId string) error {
// TODO: we don't handle the situation when header/qstr param can be defined on pathObj level. Also the path param defined on opObj
urlParsed, err := url.Parse(req.URL)
if err != nil {
@@ -361,7 +366,7 @@ func handleRequest(req *har.Request, opObj *openapi.Operation, isSuccess bool) e
IsIgnored: func(name string) bool { return false },
GeneralizeName: func(name string) string { return name },
}
handleNameVals(qstrGW, &opObj.Parameters, false)
handleNameVals(qstrGW, &opObj.Parameters, false, sampleId)
hdrGW := nvParams{
In: openapi.InHeader,
@@ -369,7 +374,7 @@ func handleRequest(req *har.Request, opObj *openapi.Operation, isSuccess bool) e
IsIgnored: isHeaderIgnored,
GeneralizeName: strings.ToLower,
}
handleNameVals(hdrGW, &opObj.Parameters, true)
handleNameVals(hdrGW, &opObj.Parameters, true, sampleId)
if isSuccess {
reqBody, err := getRequestBody(req, opObj)
@@ -378,12 +383,14 @@ func handleRequest(req *har.Request, opObj *openapi.Operation, isSuccess bool) e
}
if reqBody != nil {
setSampleID(&reqBody.Extensions, sampleId)
if req.PostData.Text == "" {
reqBody.Required = false
} else {
reqCtype, _ := getReqCtype(req)
reqMedia, err := fillContent(reqResp{Req: req}, reqBody.Content, reqCtype)
reqMedia, err := fillContent(reqResp{Req: req}, reqBody.Content, reqCtype, sampleId)
if err != nil {
return err
}
@@ -395,18 +402,20 @@ func handleRequest(req *har.Request, opObj *openapi.Operation, isSuccess bool) e
return nil
}
func handleResponse(resp *har.Response, opObj *openapi.Operation, isSuccess bool) error {
func handleResponse(resp *har.Response, opObj *openapi.Operation, isSuccess bool, sampleId string) error {
// TODO: we don't support "default" response
respObj, err := getResponseObj(resp, opObj, isSuccess)
if err != nil {
return err
}
handleRespHeaders(resp.Headers, respObj)
setSampleID(&respObj.Extensions, sampleId)
handleRespHeaders(resp.Headers, respObj, sampleId)
respCtype := getRespCtype(resp)
respContent := respObj.Content
respMedia, err := fillContent(reqResp{Resp: resp}, respContent, respCtype)
respMedia, err := fillContent(reqResp{Resp: resp}, respContent, respCtype, sampleId)
if err != nil {
return err
}
@@ -414,7 +423,7 @@ func handleResponse(resp *har.Response, opObj *openapi.Operation, isSuccess bool
return nil
}
func handleRespHeaders(reqHeaders []har.Header, respObj *openapi.ResponseObj) {
func handleRespHeaders(reqHeaders []har.Header, respObj *openapi.ResponseObj, sampleId string) {
visited := map[string]*openapi.HeaderObj{}
for _, pair := range reqHeaders {
if isHeaderIgnored(pair.Name) {
@@ -436,6 +445,8 @@ func handleRespHeaders(reqHeaders []har.Header, respObj *openapi.ResponseObj) {
logger.Log.Warningf("Failed to add example to a parameter: %s", err)
}
visited[nameGeneral] = param
setSampleID(&param.Extensions, sampleId)
}
// maintain "required" flag
@@ -456,13 +467,15 @@ func handleRespHeaders(reqHeaders []har.Header, respObj *openapi.ResponseObj) {
}
}
func fillContent(reqResp reqResp, respContent openapi.Content, ctype string) (*openapi.MediaType, error) {
func fillContent(reqResp reqResp, respContent openapi.Content, ctype string, sampleId string) (*openapi.MediaType, error) {
content, found := respContent[ctype]
if !found {
respContent[ctype] = &openapi.MediaType{}
content = respContent[ctype]
}
setSampleID(&content.Extensions, sampleId)
var text string
var isBinary bool
if reqResp.Req != nil {
@@ -474,10 +487,10 @@ func fillContent(reqResp reqResp, respContent openapi.Content, ctype string) (*o
if !isBinary && text != "" {
var exampleMsg []byte
// try treating it as json
any, isJSON := anyJSON(text)
anyVal, isJSON := anyJSON(text)
if isJSON {
// re-marshal with forced indent
if msg, err := json.MarshalIndent(any, "", "\t"); err != nil {
if msg, err := json.MarshalIndent(anyVal, "", "\t"); err != nil {
panic("Failed to re-marshal value, super-strange")
} else {
exampleMsg = msg
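The "parse, then re-marshal with a forced indent" step above can be reproduced standalone with encoding/json; here anyJSON is assumed to behave like json.Unmarshal into an interface{} value:

package main

import (
	"encoding/json"
	"fmt"
)

func main() {
	text := `{"key": "val", "n": 1}`
	var v interface{}
	if err := json.Unmarshal([]byte(text), &v); err == nil { // stand-in for anyJSON
		msg, _ := json.MarshalIndent(v, "", "\t") // re-marshal with forced indent
		fmt.Println(string(msg))
	}
}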


@@ -1,25 +1,37 @@
package oas
import (
"bytes"
"encoding/json"
"io/ioutil"
"net"
"os"
"regexp"
"strings"
"sync"
"testing"
"time"
"github.com/chanced/openapi"
"github.com/op/go-logging"
"github.com/up9inc/mizu/shared/logger"
"github.com/wI2L/jsondiff"
basenine "github.com/up9inc/basenine/client/go"
"github.com/up9inc/mizu/agent/pkg/har"
)
func GetFakeDBConn(send string) *basenine.Connection {
dummyConn := new(basenine.Connection)
dummyConn.Conn = FakeConn{
sendBuffer: bytes.NewBufferString(send),
receiveBuffer: bytes.NewBufferString(""),
}
return dummyConn
}
// if started via env, write file into subdir
func outputSpec(label string, spec *openapi.OpenAPI, t *testing.T) string {
content, err := json.MarshalIndent(spec, "", "\t")
content, err := json.MarshalIndent(spec, "", " ")
if err != nil {
panic(err)
}
@@ -42,20 +54,22 @@ func outputSpec(label string, spec *openapi.OpenAPI, t *testing.T) string {
}
func TestEntries(t *testing.T) {
logger.InitLoggerStd(logging.INFO)
//logger.InitLoggerStd(logging.INFO) causes race condition
files, err := getFiles("./test_artifacts/")
if err != nil {
t.Log(err)
t.FailNow()
}
GetDefaultOasGeneratorInstance().Start()
loadStartingOAS("test_artifacts/catalogue.json", "catalogue")
loadStartingOAS("test_artifacts/trcc.json", "trcc-api-service")
gen := NewDefaultOasGenerator()
gen.serviceSpecs = new(sync.Map)
loadStartingOAS("test_artifacts/catalogue.json", "catalogue", gen.serviceSpecs)
loadStartingOAS("test_artifacts/trcc.json", "trcc-api-service", gen.serviceSpecs)
go func() {
for {
time.Sleep(1 * time.Second)
GetDefaultOasGeneratorInstance().GetServiceSpecs().Range(func(key, val interface{}) bool {
gen.serviceSpecs.Range(func(key, val interface{}) bool {
svc := key.(string)
t.Logf("Getting spec for %s", svc)
gen := val.(*SpecGen)
@@ -68,16 +82,14 @@ func TestEntries(t *testing.T) {
}
}()
cnt, err := feedEntries(files, true)
cnt, err := feedEntries(files, true, gen)
if err != nil {
t.Log(err)
t.Fail()
}
waitQueueProcessed()
svcs := strings.Builder{}
GetDefaultOasGeneratorInstance().GetServiceSpecs().Range(func(key, val interface{}) bool {
gen.serviceSpecs.Range(func(key, val interface{}) bool {
gen := val.(*SpecGen)
svc := key.(string)
svcs.WriteString(svc + ",")
@@ -99,7 +111,7 @@ func TestEntries(t *testing.T) {
return true
})
GetDefaultOasGeneratorInstance().GetServiceSpecs().Range(func(key, val interface{}) bool {
gen.serviceSpecs.Range(func(key, val interface{}) bool {
svc := key.(string)
gen := val.(*SpecGen)
spec, err := gen.GetSpec()
@@ -123,20 +135,18 @@ func TestEntries(t *testing.T) {
}
func TestFileSingle(t *testing.T) {
GetDefaultOasGeneratorInstance().Start()
GetDefaultOasGeneratorInstance().Reset()
gen := NewDefaultOasGenerator()
gen.serviceSpecs = new(sync.Map)
// loadStartingOAS()
file := "test_artifacts/params.har"
files := []string{file}
cnt, err := feedEntries(files, true)
cnt, err := feedEntries(files, true, gen)
if err != nil {
logger.Log.Warning("Failed processing file: " + err.Error())
t.Fail()
}
waitQueueProcessed()
GetDefaultOasGeneratorInstance().GetServiceSpecs().Range(func(key, val interface{}) bool {
gen.serviceSpecs.Range(func(key, val interface{}) bool {
svc := key.(string)
gen := val.(*SpecGen)
spec, err := gen.GetSpec()
@@ -189,18 +199,7 @@ func TestFileSingle(t *testing.T) {
logger.Log.Infof("Processed entries: %d", cnt)
}
func waitQueueProcessed() {
for {
time.Sleep(100 * time.Millisecond)
queue := len(GetDefaultOasGeneratorInstance().entriesChan)
logger.Log.Infof("Queue: %d", queue)
if queue < 1 {
break
}
}
}
func loadStartingOAS(file string, label string) {
func loadStartingOAS(file string, label string, specs *sync.Map) {
fd, err := os.Open(file)
if err != nil {
panic(err)
@@ -222,12 +221,14 @@ func loadStartingOAS(file string, label string) {
gen := NewGen(label)
gen.StartFromSpec(doc)
GetDefaultOasGeneratorInstance().GetServiceSpecs().Store(label, gen)
specs.Store(label, gen)
}
func TestEntriesNegative(t *testing.T) {
gen := NewDefaultOasGenerator()
gen.serviceSpecs = new(sync.Map)
files := []string{"invalid"}
_, err := feedEntries(files, false)
_, err := feedEntries(files, false, gen)
if err == nil {
t.Logf("Should have failed")
t.Fail()
@@ -235,8 +236,10 @@ func TestEntriesNegative(t *testing.T) {
}
func TestEntriesPositive(t *testing.T) {
gen := NewDefaultOasGenerator()
gen.serviceSpecs = new(sync.Map)
files := []string{"test_artifacts/params.har"}
_, err := feedEntries(files, false)
_, err := feedEntries(files, false, gen)
if err != nil {
t.Logf("Failed")
t.Fail()
@@ -275,3 +278,17 @@ func TestLoadValid3_1(t *testing.T) {
t.FailNow()
}
}
type FakeConn struct {
sendBuffer *bytes.Buffer
receiveBuffer *bytes.Buffer
}
func (f FakeConn) Read(p []byte) (int, error) { return f.sendBuffer.Read(p) }
func (f FakeConn) Write(p []byte) (int, error) { return f.receiveBuffer.Write(p) }
func (FakeConn) Close() error { return nil }
func (FakeConn) LocalAddr() net.Addr { return nil }
func (FakeConn) RemoteAddr() net.Addr { return nil }
func (FakeConn) SetDeadline(t time.Time) error { return nil }
func (FakeConn) SetReadDeadline(t time.Time) error { return nil }
func (FakeConn) SetWriteDeadline(t time.Time) error { return nil }
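FakeConn implements every method of net.Conn, which is what lets GetFakeDBConn assign it to the connection's Conn field above; a compile-time assertion (not in the original, shown here as a sketch) makes that explicit:

var _ net.Conn = FakeConn{} // sketch: fails to compile if FakeConn stops satisfying net.Conn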


@@ -21,9 +21,11 @@
"description": "Successful call with status 200",
"content": {
"application/json": {
"example": null
"example": null,
"x-sample-entry": "000000000000000000000004"
}
}
},
"x-sample-entry": "000000000000000000000004"
}
},
"x-counters-per-source": {
@@ -45,7 +47,7 @@
"sumDuration": 0
},
"x-last-seen-ts": 1567750580.04,
"x-sample-entry": 0
"x-sample-entry": "000000000000000000000004"
}
},
"/appears-twice": {
@@ -58,9 +60,11 @@
"description": "Successful call with status 200",
"content": {
"application/json": {
"example": null
"example": null,
"x-sample-entry": "000000000000000000000006"
}
}
},
"x-sample-entry": "000000000000000000000006"
}
},
"x-counters-per-source": {
@@ -82,7 +86,7 @@
"sumDuration": 1
},
"x-last-seen-ts": 1567750581.74,
"x-sample-entry": 0
"x-sample-entry": "000000000000000000000006"
}
},
"/body-optional": {
@@ -94,8 +98,11 @@
"200": {
"description": "Successful call with status 200",
"content": {
"": {}
}
"": {
"x-sample-entry": "000000000000000000000012"
}
},
"x-sample-entry": "000000000000000000000012"
}
},
"x-counters-per-source": {
@@ -117,14 +124,16 @@
"sumDuration": 0.01
},
"x-last-seen-ts": 1567750581.75,
"x-sample-entry": 0,
"x-sample-entry": "000000000000000000000012",
"requestBody": {
"description": "Generic request body",
"content": {
"application/json": {
"example": "{\"key\", \"val\"}"
"example": "{\"key\", \"val\"}",
"x-sample-entry": "000000000000000000000011"
}
}
},
"x-sample-entry": "000000000000000000000012"
}
}
},
@@ -137,8 +146,11 @@
"200": {
"description": "Successful call with status 200",
"content": {
"": {}
}
"": {
"x-sample-entry": "000000000000000000000013"
}
},
"x-sample-entry": "000000000000000000000013"
}
},
"x-counters-per-source": {
@@ -160,15 +172,17 @@
"sumDuration": 0
},
"x-last-seen-ts": 1567750581.75,
"x-sample-entry": 0,
"x-sample-entry": "000000000000000000000013",
"requestBody": {
"description": "Generic request body",
"content": {
"": {
"example": "body exists"
"example": "body exists",
"x-sample-entry": "000000000000000000000013"
}
},
"required": true
"required": true,
"x-sample-entry": "000000000000000000000013"
}
}
},
@@ -182,9 +196,11 @@
"description": "Successful call with status 200",
"content": {
"": {
"example": {}
"example": {},
"x-sample-entry": "000000000000000000000009"
}
}
},
"x-sample-entry": "000000000000000000000009"
}
},
"x-counters-per-source": {
@@ -206,7 +222,7 @@
"sumDuration": 0
},
"x-last-seen-ts": 1567750582.74,
"x-sample-entry": 0,
"x-sample-entry": "000000000000000000000009",
"requestBody": {
"description": "Generic request body",
"content": {
@@ -233,10 +249,12 @@
}
}
},
"example": "--BOUNDARY\r\nContent-Disposition: form-data; name=\"file\"; filename=\"metadata.json\"\r\nContent-Type: application/json\r\n\r\n{\"functions\": 123}\r\n--BOUNDARY\r\nContent-Disposition: form-data; name=\"path\"\r\n\r\n/content/components\r\n--BOUNDARY--\r\n"
"example": "--BOUNDARY\r\nContent-Disposition: form-data; name=\"file\"; filename=\"metadata.json\"\r\nContent-Type: application/json\r\n\r\n{\"functions\": 123}\r\n--BOUNDARY\r\nContent-Disposition: form-data; name=\"path\"\r\n\r\n/content/components\r\n--BOUNDARY--\r\n",
"x-sample-entry": "000000000000000000000009"
}
},
"required": true
"required": true,
"x-sample-entry": "000000000000000000000009"
}
}
},
@@ -249,8 +267,11 @@
"200": {
"description": "Successful call with status 200",
"content": {
"": {}
}
"": {
"x-sample-entry": "000000000000000000000008"
}
},
"x-sample-entry": "000000000000000000000008"
}
},
"x-counters-per-source": {
@@ -272,7 +293,7 @@
"sumDuration": 1
},
"x-last-seen-ts": 1567750581.74,
"x-sample-entry": 0,
"x-sample-entry": "000000000000000000000008",
"requestBody": {
"description": "Generic request body",
"content": {
@@ -312,10 +333,12 @@
}
}
},
"example": "agent-id=ade\u0026callback-url=\u0026token=sometoken"
"example": "agent-id=ade\u0026callback-url=\u0026token=sometoken",
"x-sample-entry": "000000000000000000000008"
}
},
"required": true
"required": true,
"x-sample-entry": "000000000000000000000008"
}
}
},
@@ -331,8 +354,11 @@
"200": {
"description": "Successful call with status 200",
"content": {
"": {}
}
"": {
"x-sample-entry": "000000000000000000000014"
}
},
"x-sample-entry": "000000000000000000000014"
}
},
"x-counters-per-source": {
@@ -354,7 +380,7 @@
"sumDuration": 0
},
"x-last-seen-ts": 1567750582,
"x-sample-entry": 0
"x-sample-entry": "000000000000000000000014"
},
"parameters": [
{
@@ -369,7 +395,8 @@
"example #0": {
"value": "234324"
}
}
},
"x-sample-entry": "000000000000000000000014"
}
]
},
@@ -385,8 +412,11 @@
"200": {
"description": "Successful call with status 200",
"content": {
"": {}
}
"": {
"x-sample-entry": "000000000000000000000018"
}
},
"x-sample-entry": "000000000000000000000018"
}
},
"x-counters-per-source": {
@@ -408,7 +438,7 @@
"sumDuration": 9.53e-7
},
"x-last-seen-ts": 1567750582.00,
"x-sample-entry": 0
"x-sample-entry": "000000000000000000000018"
},
"parameters": [
{
@@ -436,7 +466,8 @@
"example #4": {
"value": "prefix-gibberish-afterwards"
}
}
},
"x-sample-entry": "000000000000000000000019"
}
]
},
@@ -452,8 +483,11 @@
"200": {
"description": "Successful call with status 200",
"content": {
"": {}
}
"": {
"x-sample-entry": "000000000000000000000015"
}
},
"x-sample-entry": "000000000000000000000015"
}
},
"x-counters-per-source": {
@@ -475,7 +509,7 @@
"sumDuration": 0
},
"x-last-seen-ts": 1567750582.00,
"x-sample-entry": 0
"x-sample-entry": "000000000000000000000015"
},
"parameters": [
{
@@ -503,7 +537,8 @@
"example #4": {
"value": "prefix-gibberish-afterwards"
}
}
},
"x-sample-entry": "000000000000000000000019"
}
]
},
@@ -519,8 +554,11 @@
"200": {
"description": "Successful call with status 200",
"content": {
"": {}
}
"": {
"x-sample-entry": "000000000000000000000016"
}
},
"x-sample-entry": "000000000000000000000016"
}
},
"x-counters-per-source": {
@@ -542,7 +580,7 @@
"sumDuration": 0
},
"x-last-seen-ts": 1567750582.00,
"x-sample-entry": 0
"x-sample-entry": "000000000000000000000016"
},
"parameters": [
{
@@ -570,7 +608,8 @@
"example #4": {
"value": "prefix-gibberish-afterwards"
}
}
},
"x-sample-entry": "000000000000000000000019"
}
]
},
@@ -586,8 +625,11 @@
"200": {
"description": "Successful call with status 200",
"content": {
"": {}
}
"": {
"x-sample-entry": "000000000000000000000019"
}
},
"x-sample-entry": "000000000000000000000019"
}
},
"x-counters-per-source": {
@@ -609,7 +651,7 @@
"sumDuration": 0
},
"x-last-seen-ts": 1567750582.00,
"x-sample-entry": 0
"x-sample-entry": "000000000000000000000019"
},
"parameters": [
{
@@ -624,7 +666,8 @@
"example #0": {
"value": "23421"
}
}
},
"x-sample-entry": "000000000000000000000019"
},
{
"name": "parampatternId",
@@ -651,7 +694,8 @@
"example #4": {
"value": "prefix-gibberish-afterwards"
}
}
},
"x-sample-entry": "000000000000000000000019"
}
]
},
@@ -665,9 +709,11 @@
"description": "Successful call with status 200",
"content": {
"application/json": {
"example": null
"example": null,
"x-sample-entry": "000000000000000000000003"
}
}
},
"x-sample-entry": "000000000000000000000003"
}
},
"x-counters-per-source": {
@@ -689,7 +735,7 @@
"sumDuration": 0
},
"x-last-seen-ts": 1567750579.74,
"x-sample-entry": 0
"x-sample-entry": "000000000000000000000003"
},
"parameters": [
{
@@ -707,7 +753,8 @@
"example #1": {
"value": "<UUID4>"
}
}
},
"x-sample-entry": "000000000000000000000003"
}
]
},
@@ -720,8 +767,11 @@
"200": {
"description": "Successful call with status 200",
"content": {
"text/html": {}
}
"text/html": {
"x-sample-entry": "000000000000000000000001"
}
},
"x-sample-entry": "000000000000000000000001"
}
},
"x-counters-per-source": {
@@ -743,7 +793,7 @@
"sumDuration": 0
},
"x-last-seen-ts": 1567750483.86,
"x-sample-entry": 0
"x-sample-entry": "000000000000000000000001"
},
"parameters": [
{
@@ -761,7 +811,8 @@
"example #1": {
"value": "<UUID4>"
}
}
},
"x-sample-entry": "000000000000000000000003"
}
]
},
@@ -775,9 +826,11 @@
"description": "Successful call with status 200",
"content": {
"application/json": {
"example": null
"example": null,
"x-sample-entry": "000000000000000000000002"
}
}
},
"x-sample-entry": "000000000000000000000002"
}
},
"x-counters-per-source": {
@@ -799,7 +852,7 @@
"sumDuration": 0
},
"x-last-seen-ts": 1567750578.74,
"x-sample-entry": 0
"x-sample-entry": "000000000000000000000002"
},
"parameters": [
{
@@ -817,7 +870,8 @@
"example #1": {
"value": "<UUID4>"
}
}
},
"x-sample-entry": "000000000000000000000003"
}
]
}


@@ -2,12 +2,13 @@ package oas
import (
"encoding/json"
"github.com/chanced/openapi"
"github.com/up9inc/mizu/shared/logger"
"net/url"
"regexp"
"strconv"
"strings"
"github.com/chanced/openapi"
"github.com/up9inc/mizu/shared/logger"
)
type NodePath = []string
@@ -20,7 +21,7 @@ type Node struct {
children []*Node
}
func (n *Node) getOrSet(path NodePath, existingPathObj *openapi.PathObj) (node *Node) {
func (n *Node) getOrSet(path NodePath, existingPathObj *openapi.PathObj, sampleId string) (node *Node) {
if existingPathObj == nil {
panic("Invalid function call")
}
@@ -70,6 +71,10 @@ func (n *Node) getOrSet(path NodePath, existingPathObj *openapi.PathObj) (node *
}
}
if node.pathParam != nil {
setSampleID(&node.pathParam.Extensions, sampleId)
}
// add example if it's a gibberish chunk
if node.pathParam != nil && !chunkIsParam {
exmp := &node.pathParam.Examples
@@ -85,7 +90,7 @@ func (n *Node) getOrSet(path NodePath, existingPathObj *openapi.PathObj) (node *
// TODO: eat up trailing slash, in a smart way: node.pathObj!=nil && path[1]==""
if len(path) > 1 {
return node.getOrSet(path[1:], existingPathObj)
return node.getOrSet(path[1:], existingPathObj, sampleId)
} else if node.pathObj == nil {
node.pathObj = existingPathObj
}


@@ -1,6 +1,7 @@
package oas
import (
"fmt"
"strings"
"testing"
@@ -20,10 +21,10 @@ func TestTree(t *testing.T) {
}
tree := new(Node)
for _, tc := range testCases {
for i, tc := range testCases {
split := strings.Split(tc.inp, "/")
pathObj := new(openapi.PathObj)
node := tree.getOrSet(split, pathObj)
node := tree.getOrSet(split, pathObj, fmt.Sprintf("%024d", i))
fillPathParams(node, pathObj)


@@ -115,7 +115,7 @@ type nvParams struct {
GeneralizeName func(name string) string
}
func handleNameVals(gw nvParams, params **openapi.ParameterList, checkIgnore bool) {
func handleNameVals(gw nvParams, params **openapi.ParameterList, checkIgnore bool, sampleId string) {
visited := map[string]*openapi.ParameterObj{}
for _, pair := range gw.Pairs {
if (checkIgnore && gw.IsIgnored(pair.Name)) || pair.Name == "" {
@@ -137,6 +137,8 @@ func handleNameVals(gw nvParams, params **openapi.ParameterList, checkIgnore boo
logger.Log.Warningf("Failed to add example to a parameter: %s", err)
}
visited[nameGeneral] = param
setSampleID(&param.Extensions, sampleId)
}
// maintain "required" flag
@@ -474,3 +476,15 @@ func intersectSliceWithMap(required []string, names map[string]struct{}) []strin
}
return required
}
func setSampleID(extensions *openapi.Extensions, id string) {
if id != "" {
if *extensions == nil {
*extensions = openapi.Extensions{}
}
err := (extensions).SetExtension(SampleId, id)
if err != nil {
logger.Log.Warningf("Failed to set sample ID: %s", err)
}
}
}

View File

@@ -29,7 +29,7 @@ func TestAnyJSON(t *testing.T) {
} else if tc.inp == "null" && any != nil {
t.Errorf("null has to parse as nil (but got %s)", any)
} else {
t.Logf("%s => %s", tc.inp, any)
t.Logf("%s => %v", tc.inp, any)
}
}
}

View File

@@ -18,10 +18,11 @@ type ServiceMapResponse struct {
}
type ServiceMapNode struct {
Id int `json:"id"`
Name string `json:"name"`
Entry *tapApi.TCP `json:"entry"`
Count int `json:"count"`
Id int `json:"id"`
Name string `json:"name"`
Entry *tapApi.TCP `json:"entry"`
Count int `json:"count"`
Resolved bool `json:"resolved"`
}
type ServiceMapEdge struct {

View File

@@ -1,6 +1,7 @@
package servicemap
import (
"github.com/jinzhu/copier"
"sync"
"github.com/up9inc/mizu/shared/logger"
@@ -183,8 +184,12 @@ func (s *defaultServiceMap) NewTCPEntry(src *tapApi.TCP, dst *tapApi.TCP, p *tap
if len(src.Name) == 0 {
srcEntry = &entryData{
key: key(src.IP),
entry: src,
entry: &tapApi.TCP{},
}
if err := copier.Copy(srcEntry.entry, src); err != nil {
logger.Log.Errorf("Error while copying src entry into src entry data")
}
srcEntry.entry.Name = UnresolvedNodeName
} else {
srcEntry = &entryData{
@@ -196,8 +201,12 @@ func (s *defaultServiceMap) NewTCPEntry(src *tapApi.TCP, dst *tapApi.TCP, p *tap
if len(dst.Name) == 0 {
dstEntry = &entryData{
key: key(dst.IP),
entry: dst,
entry: &tapApi.TCP{},
}
if err := copier.Copy(dstEntry.entry, dst); err != nil {
logger.Log.Errorf("Error while copying dst entry into dst entry data")
}
dstEntry.entry.Name = UnresolvedNodeName
} else {
dstEntry = &entryData{
@@ -224,35 +233,41 @@ func (s *defaultServiceMap) GetStatus() ServiceMapStatus {
}
func (s *defaultServiceMap) GetNodes() []ServiceMapNode {
var nodes []ServiceMapNode
nodes := []ServiceMapNode{}
for i, n := range s.graph.Nodes {
nodes = append(nodes, ServiceMapNode{
Id: n.id,
Name: string(i),
Entry: n.entry,
Count: n.count,
Id: n.id,
Name: string(i),
Resolved: n.entry.Name != UnresolvedNodeName,
Entry: n.entry,
Count: n.count,
})
}
return nodes
}
func (s *defaultServiceMap) GetEdges() []ServiceMapEdge {
var edges []ServiceMapEdge
edges := []ServiceMapEdge{}
for u, m := range s.graph.Edges {
for v := range m {
for _, p := range s.graph.Edges[u][v].data {
edges = append(edges, ServiceMapEdge{
Source: ServiceMapNode{
Id: s.graph.Nodes[u].id,
Name: string(u),
Entry: s.graph.Nodes[u].entry,
Count: s.graph.Nodes[u].count,
Id: s.graph.Nodes[u].id,
Name: string(u),
Entry: s.graph.Nodes[u].entry,
Resolved: s.graph.Nodes[u].entry.Name != UnresolvedNodeName,
Count: s.graph.Nodes[u].count,
},
Destination: ServiceMapNode{
Id: s.graph.Nodes[v].id,
Name: string(v),
Entry: s.graph.Nodes[v].entry,
Count: s.graph.Nodes[v].count,
Id: s.graph.Nodes[v].id,
Name: string(v),
Entry: s.graph.Nodes[v].entry,
Resolved: s.graph.Nodes[v].entry.Name != UnresolvedNodeName,
Count: s.graph.Nodes[v].count,
},
Count: p.count,
Protocol: p.protocol,
@@ -260,6 +275,7 @@ func (s *defaultServiceMap) GetEdges() []ServiceMapEdge {
}
}
}
return edges
}

View File

@@ -403,10 +403,10 @@ func (s *ServiceMapEnabledSuite) TestServiceMap() {
assert.Equal(0, status.EdgeCount)
// Nodes after reset
assert.Equal([]ServiceMapNode(nil), nodes)
assert.Equal([]ServiceMapNode{}, nodes)
// Edges after reset
assert.Equal([]ServiceMapEdge(nil), edges)
assert.Equal([]ServiceMapEdge{}, edges)
}
func TestServiceMapSuite(t *testing.T) {
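
The switch from a nil slice (var nodes []ServiceMapNode) to an explicitly initialized empty slice (nodes := []ServiceMapNode{}), together with the updated test assertions above, changes what the API serializes after a reset: a nil slice marshals to JSON null, while an empty slice marshals to []. A minimal sketch of the difference with plain encoding/json:

```go
package main

import (
	"encoding/json"
	"fmt"
)

type response struct {
	Nodes []string `json:"nodes"`
}

func main() {
	var nilNodes []string    // nil slice
	emptyNodes := []string{} // empty, non-nil slice

	a, _ := json.Marshal(response{Nodes: nilNodes})
	b, _ := json.Marshal(response{Nodes: emptyNodes})

	fmt.Println(string(a)) // {"nodes":null}
	fmt.Println(string(b)) // {"nodes":[]}
}
```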

View File

@@ -323,7 +323,9 @@ func syncEntriesImpl(token string, model string, envPrefix string, uploadInterva
go handleMetaChannel(&wg, connection, meta)
wg.Add(2)
connection.Query(query, data, meta)
if err = connection.Query(query, data, meta); err != nil {
logger.Log.Panicf("Query mode call failed: %v", err)
}
wg.Wait()
}

View File

@@ -11,7 +11,7 @@ require (
github.com/op/go-logging v0.0.0-20160315200505-970db520ece7
github.com/spf13/cobra v1.3.0
github.com/spf13/pflag v1.0.5
github.com/up9inc/basenine/server/lib v0.0.0-20220326121918-785f3061c8ce
github.com/up9inc/basenine/server/lib v0.0.0-20220413023528-c741e4aa1cf2
github.com/up9inc/mizu/shared v0.0.0
github.com/up9inc/mizu/tap/api v0.0.0
golang.org/x/oauth2 v0.0.0-20211104180415-d3ed0bb246c8
@@ -72,7 +72,7 @@ require (
github.com/modern-go/reflect2 v1.0.2 // indirect
github.com/monochromegane/go-gitignore v0.0.0-20200626010858-205db1a8cc00 // indirect
github.com/mxk/go-flowrate v0.0.0-20140419014527-cca7078d478f // indirect
github.com/ohler55/ojg v1.12.13 // indirect
github.com/ohler55/ojg v1.14.0 // indirect
github.com/peterbourgon/diskv v2.0.1+incompatible // indirect
github.com/pkg/errors v0.9.1 // indirect
github.com/pmezard/go-difflib v1.0.0 // indirect

View File

@@ -487,8 +487,8 @@ github.com/niemeyer/pretty v0.0.0-20200227124842-a10e7caefd8e h1:fD57ERR4JtEqsWb
github.com/niemeyer/pretty v0.0.0-20200227124842-a10e7caefd8e/go.mod h1:zD1mROLANZcx1PVRCS0qkT7pwLkGfwJo4zjcN/Tysno=
github.com/nxadm/tail v1.4.4 h1:DQuhQpB1tVlglWS2hLQ5OV6B5r8aGxSrPc5Qo6uTN78=
github.com/nxadm/tail v1.4.4/go.mod h1:kenIhsEOeOJmVchQTgglprH7qJGnHDVpk1VPCcaMI8A=
github.com/ohler55/ojg v1.12.13 h1:FvfVpYzLgMraLcg3rrXiRXaihOP6fnzQNEU9YyZ/AmM=
github.com/ohler55/ojg v1.12.13/go.mod h1:LBbIVRAgoFbYBXQhRhuEpaJIqq+goSO63/FQ+nyJU88=
github.com/ohler55/ojg v1.14.0 h1:DyHomsCwofNswmKj7BLMdx51xnKbXxgIo1rVWCaBcNk=
github.com/ohler55/ojg v1.14.0/go.mod h1:3+GH+0PggMKocQtbZCrFifal3yRpHiBT4QUkxFJI6e8=
github.com/oklog/ulid v1.3.1/go.mod h1:CirwcVhetQ6Lv90oh/F+FBtV6XMibvdAFo93nm5qn4U=
github.com/olekukonko/tablewriter v0.0.4/go.mod h1:zq6QwlOf5SlnkVbMSr5EoBv3636FWnp+qbPhuoO21uA=
github.com/onsi/ginkgo v0.0.0-20170829012221-11459a886d9c/go.mod h1:lLunBs/Ym6LB5Z9jYTR76FiuTmxDTDusOGeTQH+WWjE=
@@ -600,8 +600,8 @@ github.com/subosito/gotenv v1.2.0/go.mod h1:N0PQaV/YGNqwC0u51sEeR/aUtSLEXKX9iv69
github.com/tmc/grpc-websocket-proxy v0.0.0-20190109142713-0ad062ec5ee5/go.mod h1:ncp9v5uamzpCO7NfCPTXjqaC+bZgJeR0sMTm6dMHP7U=
github.com/tv42/httpunix v0.0.0-20150427012821-b75d8614f926/go.mod h1:9ESjWnEqriFuLhtthL60Sar/7RFoluCcXsuvEwTV5KM=
github.com/ugorji/go v1.1.4/go.mod h1:uQMGLiO92mf5W77hV/PUCpI3pbzQx3CRekS0kk+RGrc=
github.com/up9inc/basenine/server/lib v0.0.0-20220326121918-785f3061c8ce h1:PypqybjmuxftGkX4NmP4JAUyEykZj2r6W4r9lnRZ/kE=
github.com/up9inc/basenine/server/lib v0.0.0-20220326121918-785f3061c8ce/go.mod h1:ZIkxWiJm65jYQIso9k+OZKhR7gQ1we2jNyE2kQX9IQI=
github.com/up9inc/basenine/server/lib v0.0.0-20220413023528-c741e4aa1cf2 h1:rgm5a2ALbYKbItaSXx25K8vavZpeF0HN1Pk0qmOqy50=
github.com/up9inc/basenine/server/lib v0.0.0-20220413023528-c741e4aa1cf2/go.mod h1:v0hIh31iwDGbkkdeSSppdMNm1oIigfCA2mG2XajKnf8=
github.com/xiang90/probing v0.0.0-20190116061207-43a291ad63a2/go.mod h1:UETIi67q53MR2AWcXfiuqkDkRtnGDLqkBTpCHuJHxtU=
github.com/xlab/treeprint v0.0.0-20181112141820-a009c3971eca/go.mod h1:ce1O1j6UtZfjr22oyGxGLbauSBp2YVXpARAosm7dHBg=
github.com/xlab/treeprint v1.1.0 h1:G/1DjNkPpfZCFt9CSh6b5/nY4VimlbHF3Rh4obvtzDk=

View File

@@ -156,12 +156,12 @@ func (e *Emitting) Emit(item *OutputChannelItem) {
}
type Entry struct {
Id uint `json:"id"`
Id string `json:"id"`
Protocol Protocol `json:"proto"`
Capture Capture `json:"capture"`
Source *TCP `json:"src"`
Destination *TCP `json:"dst"`
Namespace string `json:"namespace,omitempty"`
Namespace string `json:"namespace"`
Outgoing bool `json:"outgoing"`
Timestamp int64 `json:"timestamp"`
StartTime time.Time `json:"startTime"`
@@ -188,7 +188,7 @@ type EntryWrapper struct {
}
type BaseEntry struct {
Id uint `json:"id"`
Id string `json:"id"`
Protocol Protocol `json:"proto,omitempty"`
Capture Capture `json:"capture"`
Summary string `json:"summary,omitempty"`

View File

@@ -13,4 +13,4 @@ test-pull-bin:
test-pull-expect:
@mkdir -p expect
@[ "${skipexpect}" ] && echo "Skipping downloading expected JSONs" || gsutil -o 'GSUtil:parallel_process_count=5' -o 'GSUtil:parallel_thread_count=5' -m cp -r gs://static.up9.io/mizu/test-pcap/expect5/amqp/\* expect
@[ "${skipexpect}" ] && echo "Skipping downloading expected JSONs" || gsutil -o 'GSUtil:parallel_process_count=5' -o 'GSUtil:parallel_thread_count=5' -m cp -r gs://static.up9.io/mizu/test-pcap/expect7/amqp/\* expect

View File

@@ -13,4 +13,4 @@ test-pull-bin:
test-pull-expect:
@mkdir -p expect
@[ "${skipexpect}" ] && echo "Skipping downloading expected JSONs" || gsutil -o 'GSUtil:parallel_process_count=5' -o 'GSUtil:parallel_thread_count=5' -m cp -r gs://static.up9.io/mizu/test-pcap/expect5/http/\* expect
@[ "${skipexpect}" ] && echo "Skipping downloading expected JSONs" || gsutil -o 'GSUtil:parallel_process_count=5' -o 'GSUtil:parallel_thread_count=5' -m cp -r gs://static.up9.io/mizu/test-pcap/expect7/http/\* expect

View File

@@ -53,7 +53,16 @@ func representMapSliceAsTable(mapSlice []interface{}, selectorPrefix string) (re
h := item.(map[string]interface{})
key := h["name"].(string)
value := h["value"]
switch reflect.TypeOf(value).Kind() {
var reflectKind reflect.Kind
reflectType := reflect.TypeOf(value)
if reflectType == nil {
reflectKind = reflect.Interface
} else {
reflectKind = reflect.TypeOf(value).Kind()
}
switch reflectKind {
case reflect.Slice:
fallthrough
case reflect.Array:
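
The nil check introduced above is needed because reflect.TypeOf returns a nil reflect.Type when the value is an untyped nil interface (for example a JSON null), and calling Kind() on that nil Type panics. A minimal sketch of the behaviour the guard protects against (kindOf is a hypothetical helper, not part of the diff):

```go
package main

import (
	"fmt"
	"reflect"
)

// kindOf mirrors the guarded logic: fall back to reflect.Interface for nil values.
func kindOf(value interface{}) reflect.Kind {
	t := reflect.TypeOf(value)
	if t == nil {
		return reflect.Interface // value was a nil interface; t.Kind() would panic
	}
	return t.Kind()
}

func main() {
	fmt.Println(kindOf(nil))            // interface
	fmt.Println(kindOf([]int{1, 2, 3})) // slice
	fmt.Println(kindOf("header value")) // string
}
```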

View File

@@ -28,26 +28,6 @@ const protoMinorHTTP2 = 0
var maxHTTP2DataLen = 1 * 1024 * 1024 // 1MB
var grpcStatusCodes = []string{
"OK",
"CANCELLED",
"UNKNOWN",
"INVALID_ARGUMENT",
"DEADLINE_EXCEEDED",
"NOT_FOUND",
"ALREADY_EXISTS",
"PERMISSION_DENIED",
"RESOURCE_EXHAUSTED",
"FAILED_PRECONDITION",
"ABORTED",
"OUT_OF_RANGE",
"UNIMPLEMENTED",
"INTERNAL",
"UNAVAILABLE",
"DATA_LOSS",
"UNAUTHENTICATED",
}
type messageFragment struct {
headers []hpack.HeaderField
data []byte
@@ -142,18 +122,8 @@ func (ga *Http2Assembler) readMessage() (streamID uint32, messageHTTP1 interface
// gRPC detection
grpcStatus := headersHTTP1.Get("Grpc-Status")
if grpcStatus != "" {
if grpcStatus != "" || strings.Contains(headersHTTP1.Get("Content-Type"), "application/grpc") {
isGrpc = true
status = grpcStatus
}
if strings.Contains(headersHTTP1.Get("Content-Type"), "application/grpc") {
isGrpc = true
grpcPath := headersHTTP1.Get(":path")
pathSegments := strings.Split(grpcPath, "/")
if len(pathSegments) > 0 {
method = pathSegments[len(pathSegments)-1]
}
}
if method != "" {

View File

@@ -248,11 +248,6 @@ func (d dissecting) Analyze(item *api.OutputChannelItem, resolvedSource string,
reqDetails["_queryStringMerged"] = mapSliceMergeRepeatedKeys(reqDetails["_queryString"].([]interface{}))
reqDetails["queryString"] = mapSliceRebuildAsMap(reqDetails["_queryStringMerged"].([]interface{}))
statusCode := int(resDetails["status"].(float64))
if item.Protocol.Abbreviation == "gRPC" {
resDetails["statusText"] = grpcStatusCodes[statusCode]
}
elapsedTime := item.Pair.Response.CaptureTime.Sub(item.Pair.Request.CaptureTime).Round(time.Millisecond).Milliseconds()
if elapsedTime < 0 {
elapsedTime = 0

View File

@@ -13,4 +13,4 @@ test-pull-bin:
test-pull-expect:
@mkdir -p expect
@[ "${skipexpect}" ] && echo "Skipping downloading expected JSONs" || gsutil -o 'GSUtil:parallel_process_count=5' -o 'GSUtil:parallel_thread_count=5' -m cp -r gs://static.up9.io/mizu/test-pcap/expect5/kafka/\* expect
@[ "${skipexpect}" ] && echo "Skipping downloading expected JSONs" || gsutil -o 'GSUtil:parallel_process_count=5' -o 'GSUtil:parallel_thread_count=5' -m cp -r gs://static.up9.io/mizu/test-pcap/expect7/kafka/\* expect

View File

@@ -8,6 +8,7 @@ require (
github.com/segmentio/kafka-go v0.4.27
github.com/stretchr/testify v1.6.1
github.com/up9inc/mizu/tap/api v0.0.0
golang.org/x/text v0.3.0
)
require (

View File

@@ -40,6 +40,7 @@ golang.org/x/crypto v0.0.0-20190506204251-e1dfcc566284/go.mod h1:yigFU9vqHzYiE8U
golang.org/x/net v0.0.0-20190404232315-eb5bcb51f2a3/go.mod h1:t9HGtf8HONx5eT2rtn7q6eTqICYqUVnKs3thJo3Qplg=
golang.org/x/sys v0.0.0-20190215142949-d0b11bdaac8a/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
golang.org/x/sys v0.0.0-20190412213103-97732733099d/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/text v0.3.0 h1:g61tztE5qeGQ89tm6NTjjM9VPIm088od1l6aSorWRWg=
golang.org/x/text v0.3.0/go.mod h1:NqM8EUOU14njkJ3fqMW+pc6Ldnwhi/IjpwHt7yyuwOQ=
golang.org/x/xerrors v0.0.0-20191204190536-9bdfabe68543 h1:E7g+9GITq07hpfrRu66IVDexMakfv52eLZ2CXBWiKr4=
golang.org/x/xerrors v0.0.0-20191204190536-9bdfabe68543/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=

View File

@@ -3,6 +3,8 @@ package kafka
import (
"encoding/json"
"fmt"
"golang.org/x/text/cases"
"golang.org/x/text/language"
"reflect"
"sort"
"strconv"
@@ -891,8 +893,9 @@ func representMapAsTable(mapData map[string]interface{}, selectorPrefix string,
}
}
selector := fmt.Sprintf("%s[\"%s\"]", selectorPrefix, key)
caser := cases.Title(language.Und, cases.NoLower)
table = append(table, api.TableData{
Name: strings.Join(camelcase.Split(strings.Title(key)), " "),
Name: strings.Join(camelcase.Split(caser.String(key)), " "),
Value: value,
Selector: selector,
})
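
For context, strings.Title has been deprecated since Go 1.18, and golang.org/x/text/cases is the recommended replacement — which is what the caser above provides. A small sketch of the substitution, using a hypothetical key name and the same caser options as in the diff:

```go
package main

import (
	"fmt"

	"golang.org/x/text/cases"
	"golang.org/x/text/language"
)

func main() {
	caser := cases.Title(language.Und, cases.NoLower)
	// NoLower keeps the rest of the word intact, so only the first rune is upper-cased:
	// "clientId" -> "ClientId" (strings.Title would have produced the same result here,
	// but without the deprecation warning).
	fmt.Println(caser.String("clientId"))
}
```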

View File

@@ -13,4 +13,4 @@ test-pull-bin:
test-pull-expect:
@mkdir -p expect
@[ "${skipexpect}" ] && echo "Skipping downloading expected JSONs" || gsutil -o 'GSUtil:parallel_process_count=5' -o 'GSUtil:parallel_thread_count=5' -m cp -r gs://static.up9.io/mizu/test-pcap/expect5/redis/\* expect
@[ "${skipexpect}" ] && echo "Skipping downloading expected JSONs" || gsutil -o 'GSUtil:parallel_process_count=5' -o 'GSUtil:parallel_thread_count=5' -m cp -r gs://static.up9.io/mizu/test-pcap/expect7/redis/\* expect

View File

@@ -112,7 +112,7 @@ func UpdateTapTargets(newTapTargets []v1.Pod) {
tapTargets = newTapTargets
packetSourceManager.UpdatePods(tapTargets)
packetSourceManager.UpdatePods(tapTargets, !*nodefrag, mainPacketInputChan)
if tlsTapperInstance != nil {
if err := tlstapper.UpdateTapTargets(tlsTapperInstance, &tapTargets, *procfs); err != nil {
@@ -198,12 +198,8 @@ func initializePacketSources() error {
}
var err error
if packetSourceManager, err = source.NewPacketSourceManager(*procfs, *fname, *iface, *servicemesh, tapTargets, behaviour); err != nil {
return err
} else {
packetSourceManager.ReadPackets(!*nodefrag, mainPacketInputChan)
return nil
}
packetSourceManager, err = source.NewPacketSourceManager(*procfs, *fname, *iface, *servicemesh, tapTargets, behaviour, !*nodefrag, mainPacketInputChan)
return err
}
func initializePassiveTapper(opts *TapOpts, outputItems chan *api.OutputChannelItem) (*tcpStreamMap, *tcpAssembler) {
@@ -259,9 +255,10 @@ func startPassiveTapper(streamsMap *tcpStreamMap, assembler *tcpAssembler) {
func startTlsTapper(extension *api.Extension, outputItems chan *api.OutputChannelItem, options *api.TrafficFilteringOptions) *tlstapper.TlsTapper {
tls := tlstapper.TlsTapper{}
tlsPerfBufferSize := os.Getpagesize() * 100
chunksBufferSize := os.Getpagesize() * 100
logBufferSize := os.Getpagesize()
if err := tls.Init(tlsPerfBufferSize, *procfs, extension); err != nil {
if err := tls.Init(chunksBufferSize, logBufferSize, *procfs, extension); err != nil {
tlstapper.LogError(err)
return nil
}
@@ -285,6 +282,7 @@ func startTlsTapper(extension *api.Extension, outputItems chan *api.OutputChanne
OutputChannel: outputItems,
}
go tls.PollForLogging()
go tls.Poll(emitter, options)
return &tls

View File

@@ -5,11 +5,12 @@ import (
"runtime"
"github.com/up9inc/mizu/shared/logger"
"github.com/up9inc/mizu/tap/api"
"github.com/vishvananda/netns"
)
func newNetnsPacketSource(procfs string, pid string,
interfaceName string, behaviour TcpPacketSourceBehaviour) (*tcpPacketSource, error) {
func newNetnsPacketSource(procfs string, pid string, interfaceName string,
behaviour TcpPacketSourceBehaviour, origin api.Capture) (*tcpPacketSource, error) {
nsh, err := netns.GetFromPath(fmt.Sprintf("%s/%s/ns/net", procfs, pid))
if err != nil {
@@ -17,7 +18,7 @@ func newNetnsPacketSource(procfs string, pid string,
return nil, err
}
src, err := newPacketSourceFromNetnsHandle(pid, nsh, interfaceName, behaviour)
src, err := newPacketSourceFromNetnsHandle(pid, nsh, interfaceName, behaviour, origin)
if err != nil {
logger.Log.Errorf("Error starting netns packet source for %s - %w", pid, err)
@@ -28,7 +29,7 @@ func newNetnsPacketSource(procfs string, pid string,
}
func newPacketSourceFromNetnsHandle(pid string, nsh netns.NsHandle, interfaceName string,
behaviour TcpPacketSourceBehaviour) (*tcpPacketSource, error) {
behaviour TcpPacketSourceBehaviour, origin api.Capture) (*tcpPacketSource, error) {
done := make(chan *tcpPacketSource)
errors := make(chan error)
@@ -57,7 +58,7 @@ func newPacketSourceFromNetnsHandle(pid string, nsh netns.NsHandle, interfaceNam
}
name := fmt.Sprintf("netns-%s-%s", pid, interfaceName)
src, err := newTcpPacketSource(name, "", interfaceName, behaviour)
src, err := newTcpPacketSource(name, "", interfaceName, behaviour, origin)
if err != nil {
logger.Log.Errorf("Error listening to PID %s - %w", pid, err)

View File

@@ -5,6 +5,7 @@ import (
"strings"
"github.com/up9inc/mizu/shared/logger"
"github.com/up9inc/mizu/tap/api"
v1 "k8s.io/api/core/v1"
)
@@ -24,7 +25,7 @@ type PacketSourceManager struct {
}
func NewPacketSourceManager(procfs string, filename string, interfaceName string,
mtls bool, pods []v1.Pod, behaviour TcpPacketSourceBehaviour) (*PacketSourceManager, error) {
mtls bool, pods []v1.Pod, behaviour TcpPacketSourceBehaviour, ipdefrag bool, packets chan<- TcpPacketInfo) (*PacketSourceManager, error) {
hostSource, err := newHostPacketSource(filename, interfaceName, behaviour)
if err != nil {
return nil, err
@@ -37,13 +38,13 @@ func NewPacketSourceManager(procfs string, filename string, interfaceName string
}
sourceManager.config = PacketSourceManagerConfig{
mtls: mtls,
procfs: procfs,
mtls: mtls,
procfs: procfs,
interfaceName: interfaceName,
behaviour: behaviour,
behaviour: behaviour,
}
sourceManager.UpdatePods(pods)
go hostSource.readPackets(ipdefrag, packets)
return sourceManager, nil
}
@@ -56,7 +57,7 @@ func newHostPacketSource(filename string, interfaceName string,
name = fmt.Sprintf("file-%s", filename)
}
source, err := newTcpPacketSource(name, filename, interfaceName, behaviour)
source, err := newTcpPacketSource(name, filename, interfaceName, behaviour, api.Pcap)
if err != nil {
return nil, err
}
@@ -64,16 +65,16 @@ func newHostPacketSource(filename string, interfaceName string,
return source, nil
}
func (m *PacketSourceManager) UpdatePods(pods []v1.Pod) {
func (m *PacketSourceManager) UpdatePods(pods []v1.Pod, ipdefrag bool, packets chan<- TcpPacketInfo) {
if m.config.mtls {
m.updateMtlsPods(m.config.procfs, pods, m.config.interfaceName, m.config.behaviour)
m.updateMtlsPods(m.config.procfs, pods, m.config.interfaceName, m.config.behaviour, ipdefrag, packets)
}
m.setBPFFilter(pods)
}
func (m *PacketSourceManager) updateMtlsPods(procfs string, pods []v1.Pod,
interfaceName string, behaviour TcpPacketSourceBehaviour) {
interfaceName string, behaviour TcpPacketSourceBehaviour, ipdefrag bool, packets chan<- TcpPacketInfo) {
relevantPids := m.getRelevantPids(procfs, pods)
logger.Log.Infof("Updating mtls pods (new: %v) (current: %v)", relevantPids, m.sources)
@@ -85,26 +86,27 @@ func (m *PacketSourceManager) updateMtlsPods(procfs string, pods []v1.Pod,
}
}
for pid := range relevantPids {
for pid, origin := range relevantPids {
if _, ok := m.sources[pid]; !ok {
source, err := newNetnsPacketSource(procfs, pid, interfaceName, behaviour)
source, err := newNetnsPacketSource(procfs, pid, interfaceName, behaviour, origin)
if err == nil {
go source.readPackets(ipdefrag, packets)
m.sources[pid] = source
}
}
}
}
func (m *PacketSourceManager) getRelevantPids(procfs string, pods []v1.Pod) map[string]bool {
relevantPids := make(map[string]bool)
relevantPids[hostSourcePid] = true
func (m *PacketSourceManager) getRelevantPids(procfs string, pods []v1.Pod) map[string]api.Capture {
relevantPids := make(map[string]api.Capture)
relevantPids[hostSourcePid] = api.Pcap
if envoyPids, err := discoverRelevantEnvoyPids(procfs, pods); err != nil {
logger.Log.Warningf("Unable to discover envoy pids - %w", err)
} else {
for _, pid := range envoyPids {
relevantPids[pid] = true
relevantPids[pid] = api.Envoy
}
}
@@ -112,7 +114,7 @@ func (m *PacketSourceManager) getRelevantPids(procfs string, pods []v1.Pod) map[
logger.Log.Warningf("Unable to discover linkerd pids - %w", err)
} else {
for _, pid := range linkerdPids {
relevantPids[pid] = true
relevantPids[pid] = api.Linkerd
}
}
@@ -153,12 +155,6 @@ func (m *PacketSourceManager) setBPFFilter(pods []v1.Pod) {
}
}
func (m *PacketSourceManager) ReadPackets(ipdefrag bool, packets chan<- TcpPacketInfo) {
for _, src := range m.sources {
go src.readPackets(ipdefrag, packets)
}
}
func (m *PacketSourceManager) Close() {
for _, src := range m.sources {
src.close()

View File

@@ -10,6 +10,7 @@ import (
"github.com/google/gopacket/layers"
"github.com/google/gopacket/pcap"
"github.com/up9inc/mizu/shared/logger"
"github.com/up9inc/mizu/tap/api"
"github.com/up9inc/mizu/tap/diagnose"
)
@@ -19,6 +20,7 @@ type tcpPacketSource struct {
defragger *ip4defrag.IPv4Defragmenter
Behaviour *TcpPacketSourceBehaviour
name string
Origin api.Capture
}
type TcpPacketSourceBehaviour struct {
@@ -36,13 +38,14 @@ type TcpPacketInfo struct {
}
func newTcpPacketSource(name, filename string, interfaceName string,
behaviour TcpPacketSourceBehaviour) (*tcpPacketSource, error) {
behaviour TcpPacketSourceBehaviour, origin api.Capture) (*tcpPacketSource, error) {
var err error
result := &tcpPacketSource{
name: name,
defragger: ip4defrag.NewIPv4Defragmenter(),
Behaviour: &behaviour,
Origin: origin,
}
if filename != "" {

View File

@@ -29,6 +29,7 @@ type tcpAssembler struct {
// The assembler context
type context struct {
CaptureInfo gopacket.CaptureInfo
Origin api.Capture
}
func (c *context) GetCaptureInfo() gopacket.CaptureInfo {
@@ -87,8 +88,10 @@ func (a *tcpAssembler) processPackets(dumpPacket bool, packets <-chan source.Tcp
logger.Log.Fatalf("Failed to set network layer for checksum: %s", err)
}
}
c := context{
CaptureInfo: packet.Metadata().CaptureInfo,
Origin: packetInfo.Source.Origin,
}
diagnose.InternalStats.Totalsz += len(tcp.Payload)
a.assemblerMutex.Lock()

View File

@@ -98,8 +98,7 @@ func (h *tcpReader) Close() {
func (h *tcpReader) run(wg *sync.WaitGroup) {
defer wg.Done()
b := bufio.NewReader(h)
// TODO: Add api.Pcap, api.Envoy and api.Linkerd distinction by refactoring NewPacketSourceManager method
err := h.extension.Dissector.Dissect(b, h.progress, api.Pcap, h.isClient, h.tcpID, h.counterPair, h.superTimer, h.parent.superIdentifier, h.emitter, filteringOptions, h.reqResMatcher)
err := h.extension.Dissector.Dissect(b, h.progress, h.parent.origin, h.isClient, h.tcpID, h.counterPair, h.superTimer, h.parent.superIdentifier, h.emitter, filteringOptions, h.reqResMatcher)
if err != nil {
_, err = io.Copy(ioutil.Discard, b)
if err != nil {

View File

@@ -29,6 +29,7 @@ type tcpStream struct {
clients []tcpReader
servers []tcpReader
ident string
origin api.Capture
sync.Mutex
streamsMap *tcpStreamMap
}
@@ -70,6 +71,9 @@ func (t *tcpStream) Accept(tcp *layers.TCP, ci gopacket.CaptureInfo, dir reassem
if !accept {
diagnose.InternalStats.RejectOpt++
}
*start = true
return accept
}

View File

@@ -78,6 +78,7 @@ func (factory *tcpStreamFactory) New(net, transport gopacket.Flow, tcp *layers.T
optchecker: reassembly.NewTCPOptionCheck(),
superIdentifier: &api.SuperIdentifier{},
streamsMap: factory.streamsMap,
origin: getPacketOrigin(ac),
}
if stream.isTapTarget {
stream.id = factory.streamsMap.nextId()
@@ -182,6 +183,17 @@ func (factory *tcpStreamFactory) shouldNotifyOnOutboundLink(dstIP string, dstPor
return true
}
func getPacketOrigin(ac reassembly.AssemblerContext) api.Capture {
c, ok := ac.(*context)
if !ok {
// If ac is not our context, fallback to Pcap
return api.Pcap
}
return c.Origin
}
type streamProps struct {
isTapTarget bool
isOutgoing bool

View File

@@ -7,8 +7,12 @@ Copyright (C) UP9 Inc.
#include "include/headers.h"
#include "include/util.h"
#include "include/maps.h"
#include "include/log.h"
#include "include/logger_messages.h"
#include "include/pids.h"
#define IPV4_ADDR_LEN (16)
struct accept_info {
__u64* sockaddr;
__u32* addrlen;
@@ -41,9 +45,7 @@ void sys_enter_accept4(struct sys_enter_accept4_ctx *ctx) {
long err = bpf_map_update_elem(&accept_syscall_context, &id, &info, BPF_ANY);
if (err != 0) {
char msg[] = "Error putting accept info (id: %ld) (err: %ld)";
bpf_trace_printk(msg, sizeof(msg), id, err);
return;
log_error(ctx, LOG_ERROR_PUTTING_ACCEPT_INFO, id, err, 0l);
}
}
@@ -70,6 +72,7 @@ void sys_exit_accept4(struct sys_exit_accept4_ctx *ctx) {
struct accept_info *infoPtr = bpf_map_lookup_elem(&accept_syscall_context, &id);
if (infoPtr == NULL) {
log_error(ctx, LOG_ERROR_GETTING_ACCEPT_INFO, id, 0l, 0l);
return;
}
@@ -79,15 +82,14 @@ void sys_exit_accept4(struct sys_exit_accept4_ctx *ctx) {
bpf_map_delete_elem(&accept_syscall_context, &id);
if (err != 0) {
char msg[] = "Error reading accept info from accept syscall (id: %ld) (err: %ld)";
bpf_trace_printk(msg, sizeof(msg), id, err);
log_error(ctx, LOG_ERROR_READING_ACCEPT_INFO, id, err, 0l);
return;
}
__u32 addrlen;
bpf_probe_read(&addrlen, sizeof(__u32), info.addrlen);
if (addrlen != 16) {
if (addrlen != IPV4_ADDR_LEN) {
// Currently only ipv4 is supported linux-src/include/linux/inet.h
return;
}
@@ -105,9 +107,7 @@ void sys_exit_accept4(struct sys_exit_accept4_ctx *ctx) {
err = bpf_map_update_elem(&file_descriptor_to_ipv4, &key, &fdinfo, BPF_ANY);
if (err != 0) {
char msg[] = "Error putting fd to address mapping from accept (key: %ld) (err: %ld)";
bpf_trace_printk(msg, sizeof(msg), key, err);
return;
log_error(ctx, LOG_ERROR_PUTTING_FD_MAPPING, id, err, ORIGIN_SYS_EXIT_ACCEPT4_CODE);
}
}
@@ -145,9 +145,7 @@ void sys_enter_connect(struct sys_enter_connect_ctx *ctx) {
long err = bpf_map_update_elem(&connect_syscall_info, &id, &info, BPF_ANY);
if (err != 0) {
char msg[] = "Error putting connect info (id: %ld) (err: %ld)";
bpf_trace_printk(msg, sizeof(msg), id, err);
return;
log_error(ctx, LOG_ERROR_PUTTING_CONNECT_INFO, id, err, 0l);
}
}
@@ -176,6 +174,7 @@ void sys_exit_connect(struct sys_exit_connect_ctx *ctx) {
struct connect_info *infoPtr = bpf_map_lookup_elem(&connect_syscall_info, &id);
if (infoPtr == NULL) {
log_error(ctx, LOG_ERROR_GETTING_CONNECT_INFO, id, 0l, 0l);
return;
}
@@ -185,12 +184,11 @@ void sys_exit_connect(struct sys_exit_connect_ctx *ctx) {
bpf_map_delete_elem(&connect_syscall_info, &id);
if (err != 0) {
char msg[] = "Error reading connect info from connect syscall (id: %ld) (err: %ld)";
bpf_trace_printk(msg, sizeof(msg), id, err);
log_error(ctx, LOG_ERROR_READING_CONNECT_INFO, id, err, 0l);
return;
}
if (info.addrlen != 16) {
if (info.addrlen != IPV4_ADDR_LEN) {
// Currently only ipv4 is supported linux-src/include/linux/inet.h
return;
}
@@ -208,8 +206,6 @@ void sys_exit_connect(struct sys_exit_connect_ctx *ctx) {
err = bpf_map_update_elem(&file_descriptor_to_ipv4, &key, &fdinfo, BPF_ANY);
if (err != 0) {
char msg[] = "Error putting fd to address mapping from connect (key: %ld) (err: %ld)";
bpf_trace_printk(msg, sizeof(msg), key, err);
return;
log_error(ctx, LOG_ERROR_PUTTING_FD_MAPPING, id, err, ORIGIN_SYS_EXIT_CONNECT_CODE);
}
}

View File

@@ -7,6 +7,8 @@ Copyright (C) UP9 Inc.
#include "include/headers.h"
#include "include/util.h"
#include "include/maps.h"
#include "include/log.h"
#include "include/logger_messages.h"
#include "include/pids.h"
struct sys_enter_read_ctx {
@@ -36,8 +38,7 @@ void sys_enter_read(struct sys_enter_read_ctx *ctx) {
long err = bpf_probe_read(&info, sizeof(struct ssl_info), infoPtr);
if (err != 0) {
char msg[] = "Error reading read info from read syscall (id: %ld) (err: %ld)";
bpf_trace_printk(msg, sizeof(msg), id, err);
log_error(ctx, LOG_ERROR_READING_SSL_CONTEXT, id, err, ORIGIN_SYS_ENTER_READ_CODE);
return;
}
@@ -46,9 +47,7 @@ void sys_enter_read(struct sys_enter_read_ctx *ctx) {
err = bpf_map_update_elem(&ssl_read_context, &id, &info, BPF_ANY);
if (err != 0) {
char msg[] = "Error putting file descriptor from read syscall (id: %ld) (err: %ld)";
bpf_trace_printk(msg, sizeof(msg), id, err);
return;
log_error(ctx, LOG_ERROR_PUTTING_FILE_DESCRIPTOR, id, err, ORIGIN_SYS_ENTER_READ_CODE);
}
}
@@ -79,8 +78,7 @@ void sys_enter_write(struct sys_enter_write_ctx *ctx) {
long err = bpf_probe_read(&info, sizeof(struct ssl_info), infoPtr);
if (err != 0) {
char msg[] = "Error reading write context from write syscall (id: %ld) (err: %ld)";
bpf_trace_printk(msg, sizeof(msg), id, err);
log_error(ctx, LOG_ERROR_READING_SSL_CONTEXT, id, err, ORIGIN_SYS_ENTER_WRITE_CODE);
return;
}
@@ -89,8 +87,6 @@ void sys_enter_write(struct sys_enter_write_ctx *ctx) {
err = bpf_map_update_elem(&ssl_write_context, &id, &info, BPF_ANY);
if (err != 0) {
char msg[] = "Error putting file descriptor from write syscall (id: %ld) (err: %ld)";
bpf_trace_printk(msg, sizeof(msg), id, err);
return;
log_error(ctx, LOG_ERROR_PUTTING_FILE_DESCRIPTOR, id, err, ORIGIN_SYS_ENTER_WRITE_CODE);
}
}

View File

@@ -0,0 +1,79 @@
/*
Note: This file is licenced differently from the rest of the project
SPDX-License-Identifier: GPL-2.0
Copyright (C) UP9 Inc.
*/
#ifndef __LOG__
#define __LOG__
// The same consts defined in bpf_logger.go
//
#define LOG_LEVEL_ERROR (0)
#define LOG_LEVEL_INFO (1)
#define LOG_LEVEL_DEBUG (2)
// The same struct can be found in bpf_logger.go
//
// Be careful when editing, alignment and padding should be exactly the same in go/c.
//
struct log_message {
__u32 level;
__u32 message_code;
__u64 arg1;
__u64 arg2;
__u64 arg3;
};
static __always_inline void log_error(void* ctx, __u16 message_code, __u64 arg1, __u64 arg2, __u64 arg3) {
struct log_message entry = {};
entry.level = LOG_LEVEL_ERROR;
entry.message_code = message_code;
entry.arg1 = arg1;
entry.arg2 = arg2;
entry.arg3 = arg3;
long err = bpf_perf_event_output(ctx, &log_buffer, BPF_F_CURRENT_CPU, &entry, sizeof(struct log_message));
if (err != 0) {
char msg[] = "Error writing log error to perf buffer - %ld";
bpf_trace_printk(msg, sizeof(msg), err);
}
}
static __always_inline void log_info(void* ctx, __u16 message_code, __u64 arg1, __u64 arg2, __u64 arg3) {
struct log_message entry = {};
entry.level = LOG_LEVEL_INFO;
entry.message_code = message_code;
entry.arg1 = arg1;
entry.arg2 = arg2;
entry.arg3 = arg3;
long err = bpf_perf_event_output(ctx, &log_buffer, BPF_F_CURRENT_CPU, &entry, sizeof(struct log_message));
if (err != 0) {
char msg[] = "Error writing log info to perf buffer - %ld";
bpf_trace_printk(msg, sizeof(msg), arg1, err);
}
}
static __always_inline void log_debug(void* ctx, __u16 message_code, __u64 arg1, __u64 arg2, __u64 arg3) {
struct log_message entry = {};
entry.level = LOG_LEVEL_DEBUG;
entry.message_code = message_code;
entry.arg1 = arg1;
entry.arg2 = arg2;
entry.arg3 = arg3;
long err = bpf_perf_event_output(ctx, &log_buffer, BPF_F_CURRENT_CPU, &entry, sizeof(struct log_message));
if (err != 0) {
char msg[] = "Error writing log debug to perf buffer - %ld";
bpf_trace_printk(msg, sizeof(msg), arg1, err);
}
}
#endif /* __LOG__ */
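
The header above stresses that struct log_message must stay byte-for-byte compatible with the Go side (see bpf_logger.go further down in this diff, where the records are decoded with binary.Read in little-endian order). A quick sanity check of the 32-byte layout from Go — a sketch assuming the mirrored logMessage struct shown in that file:

```go
package main

import (
	"encoding/binary"
	"fmt"
)

// Mirror of struct log_message in log.h: two __u32 fields followed by three __u64 fields.
type logMessage struct {
	Level       uint32
	MessageCode uint32
	Arg1        uint64
	Arg2        uint64
	Arg3        uint64
}

func main() {
	// 4 + 4 + 3*8 = 32 bytes with no internal padding, matching the C layout.
	fmt.Println(binary.Size(logMessage{}))
}
```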

View File

@@ -0,0 +1,42 @@
/*
Note: This file is licenced differently from the rest of the project
SPDX-License-Identifier: GPL-2.0
Copyright (C) UP9 Inc.
*/
#ifndef __LOG_MESSAGES__
#define __LOG_MESSAGES__
// Must be synced with bpf_logger_messages.go
//
#define LOG_ERROR_READING_BYTES_COUNT (0)
#define LOG_ERROR_READING_FD_ADDRESS (1)
#define LOG_ERROR_READING_FROM_SSL_BUFFER (2)
#define LOG_ERROR_BUFFER_TOO_BIG (3)
#define LOG_ERROR_ALLOCATING_CHUNK (4)
#define LOG_ERROR_READING_SSL_CONTEXT (5)
#define LOG_ERROR_PUTTING_SSL_CONTEXT (6)
#define LOG_ERROR_GETTING_SSL_CONTEXT (7)
#define LOG_ERROR_MISSING_FILE_DESCRIPTOR (8)
#define LOG_ERROR_PUTTING_FILE_DESCRIPTOR (9)
#define LOG_ERROR_PUTTING_ACCEPT_INFO (10)
#define LOG_ERROR_GETTING_ACCEPT_INFO (11)
#define LOG_ERROR_READING_ACCEPT_INFO (12)
#define LOG_ERROR_PUTTING_FD_MAPPING (13)
#define LOG_ERROR_PUTTING_CONNECT_INFO (14)
#define LOG_ERROR_GETTING_CONNECT_INFO (15)
#define LOG_ERROR_READING_CONNECT_INFO (16)
// Sometimes we have the same error happening from different locations.
// In order to be able to distinguish between them in the log, we add an
// extra number that identifies the location. The number can be anything,
// but do not give the same number to different origins.
//
#define ORIGIN_SSL_UPROBE_CODE (0l)
#define ORIGIN_SSL_URETPROBE_CODE (1l)
#define ORIGIN_SYS_ENTER_READ_CODE (2l)
#define ORIGIN_SYS_ENTER_WRITE_CODE (3l)
#define ORIGIN_SYS_EXIT_ACCEPT4_CODE (4l)
#define ORIGIN_SYS_EXIT_CONNECT_CODE (5l)
#endif /* __LOG_MESSAGES__ */

View File

@@ -70,5 +70,6 @@ BPF_LRU_HASH(ssl_write_context, __u64, struct ssl_info);
BPF_LRU_HASH(ssl_read_context, __u64, struct ssl_info);
BPF_HASH(file_descriptor_to_ipv4, __u64, struct fd_info);
BPF_PERF_OUTPUT(chunks_buffer);
BPF_PERF_OUTPUT(log_buffer);
#endif /* __MAPS__ */

View File

@@ -7,6 +7,8 @@ Copyright (C) UP9 Inc.
#include "include/headers.h"
#include "include/util.h"
#include "include/maps.h"
#include "include/log.h"
#include "include/logger_messages.h"
#include "include/pids.h"
// Heap-like area for eBPF programs - stack size limited to 512 bytes, we must use maps for bigger (chunk) objects.
@@ -39,15 +41,14 @@ static __always_inline int get_count_bytes(struct pt_regs *ctx, struct ssl_info*
long err = bpf_probe_read(&countBytes, sizeof(size_t), (void*) info->count_ptr);
if (err != 0) {
char msg[] = "Error reading bytes count of _ex (id: %ld) (err: %ld)";
bpf_trace_printk(msg, sizeof(msg), id, err);
log_error(ctx, LOG_ERROR_READING_BYTES_COUNT, id, err, 0l);
return 0;
}
return countBytes;
}
static __always_inline void add_address_to_chunk(struct tlsChunk* chunk, __u64 id, __u32 fd) {
static __always_inline void add_address_to_chunk(struct pt_regs *ctx, struct tlsChunk* chunk, __u64 id, __u32 fd) {
__u32 pid = id >> 32;
__u64 key = (__u64) pid << 32 | fd;
@@ -61,8 +62,7 @@ static __always_inline void add_address_to_chunk(struct tlsChunk* chunk, __u64 i
chunk->flags |= (fdinfo->flags & FLAGS_IS_CLIENT_BIT);
if (err != 0) {
char msg[] = "Error reading from fd address %ld - %ld";
bpf_trace_printk(msg, sizeof(msg), id, err);
log_error(ctx, LOG_ERROR_READING_FD_ADDRESS, id, err, 0l);
}
}
@@ -88,8 +88,7 @@ static __always_inline void send_chunk_part(struct pt_regs *ctx, __u8* buffer, _
}
if (err != 0) {
char msg[] = "Error reading from ssl buffer %ld - %ld";
bpf_trace_printk(msg, sizeof(msg), id, err);
log_error(ctx, LOG_ERROR_READING_FROM_SSL_BUFFER, id, err, 0l);
return;
}
@@ -101,8 +100,9 @@ static __always_inline void send_chunk(struct pt_regs *ctx, __u8* buffer, __u64
//
// https://lwn.net/Articles/794934/
//
// If we want to compile in kernel older than 5.3, we should add "#pragma unroll" to this loop
// However we want to run on kernels older than 5.3, hence we use "#pragma unroll" anyway
//
#pragma unroll
for (int i = 0; i < MAX_CHUNKS_PER_OPERATION; i++) {
if (chunk->len <= (CHUNK_SIZE * i)) {
break;
@@ -120,8 +120,7 @@ static __always_inline void output_ssl_chunk(struct pt_regs *ctx, struct ssl_inf
}
if (countBytes > (CHUNK_SIZE * MAX_CHUNKS_PER_OPERATION)) {
char msg[] = "Buffer too big %d (id: %ld)";
bpf_trace_printk(msg, sizeof(msg), countBytes, id);
log_error(ctx, LOG_ERROR_BUFFER_TOO_BIG, id, countBytes, 0l);
return;
}
@@ -134,8 +133,7 @@ static __always_inline void output_ssl_chunk(struct pt_regs *ctx, struct ssl_inf
chunk = bpf_map_lookup_elem(&heap, &zero);
if (!chunk) {
char msg[] = "Unable to allocate chunk (id: %ld)";
bpf_trace_printk(msg, sizeof(msg), id);
log_error(ctx, LOG_ERROR_ALLOCATING_CHUNK, id, 0l, 0l);
return;
}
@@ -145,11 +143,11 @@ static __always_inline void output_ssl_chunk(struct pt_regs *ctx, struct ssl_inf
chunk->len = countBytes;
chunk->fd = info->fd;
add_address_to_chunk(chunk, id, chunk->fd);
add_address_to_chunk(ctx, chunk, id, chunk->fd);
send_chunk(ctx, info->buffer, id, chunk);
}
static __always_inline void ssl_uprobe(void* ssl, void* buffer, int num, struct bpf_map_def* map_fd, size_t *count_ptr) {
static __always_inline void ssl_uprobe(struct pt_regs *ctx, void* ssl, void* buffer, int num, struct bpf_map_def* map_fd, size_t *count_ptr) {
__u64 id = bpf_get_current_pid_tgid();
if (!should_tap(id >> 32)) {
@@ -166,8 +164,7 @@ static __always_inline void ssl_uprobe(void* ssl, void* buffer, int num, struct
long err = bpf_probe_read(&info, sizeof(struct ssl_info), infoPtr);
if (err != 0) {
char msg[] = "Error reading old ssl context (id: %ld) (err: %ld)";
bpf_trace_printk(msg, sizeof(msg), id, err);
log_error(ctx, LOG_ERROR_READING_SSL_CONTEXT, id, err, ORIGIN_SSL_UPROBE_CODE);
}
if ((bpf_ktime_get_ns() - info.created_at_nano) > SSL_INFO_MAX_TTL_NANO) {
@@ -184,8 +181,7 @@ static __always_inline void ssl_uprobe(void* ssl, void* buffer, int num, struct
long err = bpf_map_update_elem(map_fd, &id, &info, BPF_ANY);
if (err != 0) {
char msg[] = "Error putting ssl context (id: %ld) (err: %ld)";
bpf_trace_printk(msg, sizeof(msg), id, err);
log_error(ctx, LOG_ERROR_PUTTING_SSL_CONTEXT, id, err, 0l);
}
}
@@ -199,8 +195,7 @@ static __always_inline void ssl_uretprobe(struct pt_regs *ctx, struct bpf_map_de
struct ssl_info *infoPtr = bpf_map_lookup_elem(map_fd, &id);
if (infoPtr == NULL) {
char msg[] = "Error getting ssl context info (id: %ld)";
bpf_trace_printk(msg, sizeof(msg), id);
log_error(ctx, LOG_ERROR_GETTING_SSL_CONTEXT, id, 0l, 0l);
return;
}
@@ -220,14 +215,12 @@ static __always_inline void ssl_uretprobe(struct pt_regs *ctx, struct bpf_map_de
// bpf_map_delete_elem(map_fd, &id);
if (err != 0) {
char msg[] = "Error reading ssl context (id: %ld) (err: %ld)";
bpf_trace_printk(msg, sizeof(msg), id, err);
log_error(ctx, LOG_ERROR_READING_SSL_CONTEXT, id, err, ORIGIN_SSL_URETPROBE_CODE);
return;
}
if (info.fd == -1) {
char msg[] = "File descriptor is missing from ssl info (id: %ld)";
bpf_trace_printk(msg, sizeof(msg), id);
log_error(ctx, LOG_ERROR_MISSING_FILE_DESCRIPTOR, id, 0l, 0l);
return;
}
@@ -236,7 +229,7 @@ static __always_inline void ssl_uretprobe(struct pt_regs *ctx, struct bpf_map_de
SEC("uprobe/ssl_write")
void BPF_KPROBE(ssl_write, void* ssl, void* buffer, int num) {
ssl_uprobe(ssl, buffer, num, &ssl_write_context, 0);
ssl_uprobe(ctx, ssl, buffer, num, &ssl_write_context, 0);
}
SEC("uretprobe/ssl_write")
@@ -246,7 +239,7 @@ void BPF_KPROBE(ssl_ret_write) {
SEC("uprobe/ssl_read")
void BPF_KPROBE(ssl_read, void* ssl, void* buffer, int num) {
ssl_uprobe(ssl, buffer, num, &ssl_read_context, 0);
ssl_uprobe(ctx, ssl, buffer, num, &ssl_read_context, 0);
}
SEC("uretprobe/ssl_read")
@@ -256,7 +249,7 @@ void BPF_KPROBE(ssl_ret_read) {
SEC("uprobe/ssl_write_ex")
void BPF_KPROBE(ssl_write_ex, void* ssl, void* buffer, size_t num, size_t *written) {
ssl_uprobe(ssl, buffer, num, &ssl_write_context, written);
ssl_uprobe(ctx, ssl, buffer, num, &ssl_write_context, written);
}
SEC("uretprobe/ssl_write_ex")
@@ -266,7 +259,7 @@ void BPF_KPROBE(ssl_ret_write_ex) {
SEC("uprobe/ssl_read_ex")
void BPF_KPROBE(ssl_read_ex, void* ssl, void* buffer, size_t num, size_t *readbytes) {
ssl_uprobe(ssl, buffer, num, &ssl_read_context, readbytes);
ssl_uprobe(ctx, ssl, buffer, num, &ssl_read_context, readbytes);
}
SEC("uretprobe/ssl_read_ex")

View File

@@ -7,6 +7,8 @@ Copyright (C) UP9 Inc.
#include "include/headers.h"
#include "include/util.h"
#include "include/maps.h"
#include "include/log.h"
#include "include/logger_messages.h"
#include "include/pids.h"
// To avoid multiple .o files

tap/tlstapper/bpf_logger.go (new file, 116 lines)
View File

@@ -0,0 +1,116 @@
package tlstapper
import (
"bytes"
"encoding/binary"
"strings"
"github.com/cilium/ebpf/perf"
"github.com/go-errors/errors"
"github.com/up9inc/mizu/shared/logger"
)
const logPrefix = "[bpf] "
// The same consts defined in log.h
//
const logLevelError = 0
const logLevelInfo = 1
const logLevelDebug = 2
type logMessage struct {
Level uint32
MessageCode uint32
Arg1 uint64
Arg2 uint64
Arg3 uint64
}
type bpfLogger struct {
logReader *perf.Reader
}
func newBpfLogger() *bpfLogger {
return &bpfLogger{
logReader: nil,
}
}
func (p *bpfLogger) init(bpfObjects *tlsTapperObjects, bufferSize int) error {
var err error
p.logReader, err = perf.NewReader(bpfObjects.LogBuffer, bufferSize)
if err != nil {
return errors.Wrap(err, 0)
}
return nil
}
func (p *bpfLogger) close() error {
return p.logReader.Close()
}
func (p *bpfLogger) poll() {
logger.Log.Infof("Start polling for bpf logs")
for {
record, err := p.logReader.Read()
if err != nil {
if errors.Is(err, perf.ErrClosed) {
return
}
LogError(errors.Errorf("Error reading from bpf logger perf buffer, aboring logger! %w", err))
return
}
if record.LostSamples != 0 {
logger.Log.Infof("Log buffer is full, dropped %d logs", record.LostSamples)
continue
}
buffer := bytes.NewReader(record.RawSample)
var log logMessage
if err := binary.Read(buffer, binary.LittleEndian, &log); err != nil {
LogError(errors.Errorf("Error parsing log %v", err))
continue
}
p.log(&log)
}
}
func (p *bpfLogger) log(log *logMessage) {
if int(log.MessageCode) >= len(bpfLogMessages) {
logger.Log.Errorf("Unknown message code from bpf logger %d", log.MessageCode)
return
}
format := bpfLogMessages[log.MessageCode]
tokensCount := strings.Count(format, "%")
if tokensCount == 0 {
p.logLevel(log.Level, format)
} else if tokensCount == 1 {
p.logLevel(log.Level, format, log.Arg1)
} else if tokensCount == 2 {
p.logLevel(log.Level, format, log.Arg1, log.Arg2)
} else if tokensCount == 3 {
p.logLevel(log.Level, format, log.Arg1, log.Arg2, log.Arg3)
}
}
func (p *bpfLogger) logLevel(level uint32, format string, args ...interface{}) {
if level == logLevelError {
logger.Log.Errorf(logPrefix+format, args...)
} else if level == logLevelInfo {
logger.Log.Infof(logPrefix+format, args...)
} else if level == logLevelDebug {
logger.Log.Debugf(logPrefix+format, args...)
}
}

View File

@@ -0,0 +1,25 @@
package tlstapper
// Must be synced with logger_messages.h
//
var bpfLogMessages = []string {
/*0000*/ "[%d] Unable to read bytes count from _ex methods [err: %d]",
/*0001*/ "[%d] Unable to read ipv4 address [err: %d]",
/*0002*/ "[%d] Unable to read ssl buffer [err: %d]",
/*0003*/ "[%d] Buffer is too big [size: %d]",
/*0004*/ "[%d] Unable to allocate chunk in bpf heap",
/*0005*/ "[%d] Unable to read ssl context [err: %d] [origin: %d]",
/*0006*/ "[%d] Unable to put ssl context [err: %d]",
/*0007*/ "[%d] Unable to get ssl context",
/*0008*/ "[%d] File descriptor is missing for tls chunk",
/*0009*/ "[%d] Unable to put file descriptor [err: %d] [origin: %d]",
/*0010*/ "[%d] Unable to put accept info [err: %d]",
/*0011*/ "[%d] Unable to get accept info",
/*0012*/ "[%d] Unable to read accept info [err: %d]",
/*0013*/ "[%d] Unable to put file descriptor to address mapping [err: %d] [origin: %d]",
/*0014*/ "[%d] Unable to put connect info [err: %d]",
/*0015*/ "[%d] Unable to get connect info",
/*0016*/ "[%d] Unable to read connect info [err: %d]",
}

View File

@@ -8,6 +8,8 @@ import (
"sync"
)
const GLOABL_TAP_PID = 0
//go:generate go run github.com/cilium/ebpf/cmd/bpf2go tlsTapper bpf/tls_tapper.c -- -O2 -g -D__TARGET_ARCH_x86
type TlsTapper struct {
@@ -15,11 +17,12 @@ type TlsTapper struct {
syscallHooks syscallHooks
sslHooksStructs []sslHooks
poller *tlsPoller
bpfLogger *bpfLogger
registeredPids sync.Map
}
func (t *TlsTapper) Init(bufferSize int, procfs string, extension *api.Extension) error {
logger.Log.Infof("Initializing tls tapper (bufferSize: %v)", bufferSize)
func (t *TlsTapper) Init(chunksBufferSize int, logBufferSize int, procfs string, extension *api.Extension) error {
logger.Log.Infof("Initializing tls tapper (chunksSize: %d) (logSize: %d)", chunksBufferSize, logBufferSize)
if err := setupRLimit(); err != nil {
return err
@@ -37,16 +40,25 @@ func (t *TlsTapper) Init(bufferSize int, procfs string, extension *api.Extension
t.sslHooksStructs = make([]sslHooks, 0)
t.bpfLogger = newBpfLogger()
if err := t.bpfLogger.init(&t.bpfObjects, logBufferSize); err != nil {
return err
}
t.poller = newTlsPoller(t, extension, procfs)
return t.poller.init(&t.bpfObjects, bufferSize)
return t.poller.init(&t.bpfObjects, chunksBufferSize)
}
func (t *TlsTapper) Poll(emitter api.Emitter, options *api.TrafficFilteringOptions) {
t.poller.poll(emitter, options)
}
func (t *TlsTapper) PollForLogging() {
t.bpfLogger.poll()
}
func (t *TlsTapper) GlobalTap(sslLibrary string) error {
return t.tapPid(0, sslLibrary)
return t.tapPid(GLOABL_TAP_PID, sslLibrary)
}
func (t *TlsTapper) AddPid(procfs string, pid uint32) error {
@@ -74,7 +86,12 @@ func (t *TlsTapper) RemovePid(pid uint32) error {
func (t *TlsTapper) ClearPids() {
t.registeredPids.Range(func(key, v interface{}) bool {
if err := t.RemovePid(key.(uint32)); err != nil {
pid := key.(uint32)
if pid == GLOABL_TAP_PID {
return true
}
if err := t.RemovePid(pid); err != nil {
LogError(err)
}
t.registeredPids.Delete(key)
@@ -95,6 +112,10 @@ func (t *TlsTapper) Close() []error {
errors = append(errors, sslHooks.close()...)
}
if err := t.bpfLogger.close(); err != nil {
errors = append(errors, err)
}
if err := t.poller.close(); err != nil {
errors = append(errors, err)
}

View File

@@ -78,6 +78,7 @@ type tlsTapperMapSpecs struct {
ConnectSyscallInfo *ebpf.MapSpec `ebpf:"connect_syscall_info"`
FileDescriptorToIpv4 *ebpf.MapSpec `ebpf:"file_descriptor_to_ipv4"`
Heap *ebpf.MapSpec `ebpf:"heap"`
LogBuffer *ebpf.MapSpec `ebpf:"log_buffer"`
PidsMap *ebpf.MapSpec `ebpf:"pids_map"`
SslReadContext *ebpf.MapSpec `ebpf:"ssl_read_context"`
SslWriteContext *ebpf.MapSpec `ebpf:"ssl_write_context"`
@@ -107,6 +108,7 @@ type tlsTapperMaps struct {
ConnectSyscallInfo *ebpf.Map `ebpf:"connect_syscall_info"`
FileDescriptorToIpv4 *ebpf.Map `ebpf:"file_descriptor_to_ipv4"`
Heap *ebpf.Map `ebpf:"heap"`
LogBuffer *ebpf.Map `ebpf:"log_buffer"`
PidsMap *ebpf.Map `ebpf:"pids_map"`
SslReadContext *ebpf.Map `ebpf:"ssl_read_context"`
SslWriteContext *ebpf.Map `ebpf:"ssl_write_context"`
@@ -119,6 +121,7 @@ func (m *tlsTapperMaps) Close() error {
m.ConnectSyscallInfo,
m.FileDescriptorToIpv4,
m.Heap,
m.LogBuffer,
m.PidsMap,
m.SslReadContext,
m.SslWriteContext,

Binary file not shown.

View File

@@ -78,6 +78,7 @@ type tlsTapperMapSpecs struct {
ConnectSyscallInfo *ebpf.MapSpec `ebpf:"connect_syscall_info"`
FileDescriptorToIpv4 *ebpf.MapSpec `ebpf:"file_descriptor_to_ipv4"`
Heap *ebpf.MapSpec `ebpf:"heap"`
LogBuffer *ebpf.MapSpec `ebpf:"log_buffer"`
PidsMap *ebpf.MapSpec `ebpf:"pids_map"`
SslReadContext *ebpf.MapSpec `ebpf:"ssl_read_context"`
SslWriteContext *ebpf.MapSpec `ebpf:"ssl_write_context"`
@@ -107,6 +108,7 @@ type tlsTapperMaps struct {
ConnectSyscallInfo *ebpf.Map `ebpf:"connect_syscall_info"`
FileDescriptorToIpv4 *ebpf.Map `ebpf:"file_descriptor_to_ipv4"`
Heap *ebpf.Map `ebpf:"heap"`
LogBuffer *ebpf.Map `ebpf:"log_buffer"`
PidsMap *ebpf.Map `ebpf:"pids_map"`
SslReadContext *ebpf.Map `ebpf:"ssl_read_context"`
SslWriteContext *ebpf.Map `ebpf:"ssl_write_context"`
@@ -119,6 +121,7 @@ func (m *tlsTapperMaps) Close() error {
m.ConnectSyscallInfo,
m.FileDescriptorToIpv4,
m.Heap,
m.LogBuffer,
m.PidsMap,
m.SslReadContext,
m.SslWriteContext,

Binary file not shown.

View File

@@ -1,12 +1,12 @@
{
"name": "@up9/mizu-common",
"version": "1.0.135",
"version": "0.0.0",
"lockfileVersion": 2,
"requires": true,
"packages": {
"": {
"name": "@up9/mizu-common",
"version": "1.0.135",
"version": "0.0.0",
"license": "MIT",
"dependencies": {
"@craco/craco": "^6.4.3",

View File

@@ -1,6 +1,6 @@
{
"name": "@up9/mizu-common",
"version": "1.0.144",
"version": "0.0.0",
"description": "Made with create-react-library",
"author": "",
"license": "MIT",
@@ -26,15 +26,16 @@
"@material-ui/core": "^4.11.3",
"@material-ui/icons": "^4.11.2",
"@material-ui/lab": "^4.0.0-alpha.60",
"@types/jest": "^26.0.22",
"@types/node": "^12.20.10",
"node-sass": "^6.0.0",
"react": "^17.0.2",
"react-dom": "^17.0.2",
"recoil": "^0.5.2",
"react-copy-to-clipboard": "^5.0.3",
"@types/jest": "^26.0.22",
"@types/node": "^12.20.10"
"react-dom": "^17.0.2",
"recoil": "^0.5.2"
},
"dependencies": {
"@craco/craco": "^6.4.3",
"@types/lodash": "^4.14.179",
"@uiw/react-textarea-code-editor": "^1.4.12",
"axios": "^0.25.0",
@@ -58,8 +59,7 @@
"redoc": "^2.0.0-rc.59",
"styled-components": "^5.3.3",
"web-vitals": "^1.1.1",
"xml-formatter": "^2.6.0",
"@craco/craco": "^6.4.3"
"xml-formatter": "^2.6.0"
},
"devDependencies": {
"@rollup/plugin-node-resolve": "^13.1.3",

View File

@@ -8,6 +8,7 @@ import openApiLogo from 'assets/openApiLogo.png'
import { redocThemeOptions } from "./redocThemeOptions";
import React from "react";
import { Select } from "../UI/Select";
import { TOAST_CONTAINER_ID } from "../../configs/Consts";
const modalStyle = {
@@ -42,10 +43,10 @@ const OasModal = ({ openModal, handleCloseModal, getOasServices, getOasByService
try {
const data = await getOasByService(selectedService ? selectedService : oasServices[0]);
setSelectedServiceSpec(data);
} catch (e) {
toast.error("Error occurred while fetching service OAS spec");
console.error(e);
}
} catch (e) {
toast.error("Error occurred while fetching service OAS spec", { containerId: TOAST_CONTAINER_ID });
console.error(e);
}
};
useEffect(() => {
@@ -61,7 +62,7 @@ const OasModal = ({ openModal, handleCloseModal, getOasServices, getOasByService
useEffect(() => {
onSelectedOASService(null);
},[oasServices])
}, [oasServices])
return (
<Modal
@@ -80,28 +81,28 @@ const OasModal = ({ openModal, handleCloseModal, getOasServices, getOasByService
<div className={style.boxContainer}>
<div className={style.selectHeader}>
<div><img src={openApiLogo} alt="openAPI" className={style.openApilogo} /></div>
<div className={style.title}>OpenAPI</div>
<div className={style.title}>Service Catalog</div>
</div>
<div style={{ cursor: "pointer" }}>
<img src={closeIcon} alt="close" onClick={handleCloseModal} />
</div>
</div>
<div className={style.selectContainer} >
<FormControl>
<Select
labelId="service-select-label"
id="service-select"
value={selectedServiceName}
onChangeCb={onSelectedOASService}
>
{oasServices.map((service) => (
<MenuItem key={service} value={service}>
{service}
</MenuItem>
))}
</Select>
</FormControl>
</div>
<FormControl>
<Select
labelId="service-select-label"
id="service-select"
value={selectedServiceName}
onChangeCb={onSelectedOASService}
>
{oasServices.map((service) => (
<MenuItem key={service} value={service}>
{service}
</MenuItem>
))}
</Select>
</FormControl>
</div>
<div className={style.borderLine}></div>
<div className={style.redoc}>
{selectedServiceSpec && <RedocStandalone

View File

@@ -0,0 +1,58 @@
@import "../../variables.module"
.modalContainer
display: flex
width: 100%
height: 100%
.graphSection
flex: 85%
.filterSection
flex: 15%
height: 100%
.filters table
margin-top: 0px
tr
border-style: none
td
color: #8f9bb2
font-size: 11px
font-weight: 600
padding-top: 2px
padding-bottom: 2px
.colorBlock
display: inline-block
height: 15px
width: 50px
.filterWrapper
height: 100%
display: flex
flex-direction: column
margin-right: 10px
width: 100%
.servicesFilterSearch
width: calc(100% - 10px)
max-width: 300px
box-shadow: 0px 1px 5px #979797
margin-left: 10px
margin-bottom: 5px
.servicesFilter
margin-top: 15px
height: 100%
overflow: hidden
& .servicesFilterList
overflow-y: auto
height: calc(100% - 30px - 5px)
.separtorLine
margin-top: 10px
border: 1px solid #E9EBF8

View File

@@ -0,0 +1,225 @@
import React, { useState, useEffect, useCallback, useMemo } from "react";
import { Box, Fade, Modal, Backdrop, Button } from "@material-ui/core";
import { toast } from "react-toastify";
import spinnerStyle from '../UI/style/Spinner.module.sass';
import spinnerImg from 'assets/spinner.svg';
import Graph from "react-graph-vis";
import debounce from 'lodash/debounce';
import ServiceMapOptions from './ServiceMapOptions'
import { useCommonStyles } from "../../helpers/commonStyle";
import refreshIcon from "assets/refresh.svg";
import closeIcon from "assets/close.svg"
import styles from './ServiceMapModal.module.sass'
import SelectList from "../UI/SelectList";
import { GraphData, ServiceMapGraph } from "./ServiceMapModalTypes"
import { Utils } from "../../helpers/Utils";
import { TOAST_CONTAINER_ID } from "../../configs/Consts";
import Resizeable from "../UI/Resizeable"
const modalStyle = {
position: 'absolute',
top: '6%',
left: '50%',
transform: 'translate(-50%, 0%)',
width: '89vw',
height: '82vh',
bgcolor: 'background.paper',
borderRadius: '5px',
boxShadow: 24,
p: 4,
color: '#000',
padding: "25px 15px"
};
interface LegentLabelProps {
color: string,
name: string
}
const LegentLabel: React.FC<LegentLabelProps> = ({ color, name }) => {
return <React.Fragment>
<div style={{ display: "flex", justifyContent: "space-between", alignItems: "center" }}>
<span title={name}>{name}</span>
<span style={{ background: color }} className={styles.colorBlock}></span>
</div>
</React.Fragment>
}
const protocols = [
{ key: "HTTP", value: "HTTP", component: <LegentLabel color="#205cf5" name="HTTP" /> },
{ key: "HTTP/2", value: "HTTP/2", component: <LegentLabel color='#244c5a' name="HTTP/2" /> },
{ key: "gRPC", value: "gRPC", component: <LegentLabel color='#244c5a' name="gRPC" /> },
{ key: "AMQP", value: "AMQP", component: <LegentLabel color='#ff6600' name="AMQP" /> },
{ key: "KAFKA", value: "KAFKA", component: <LegentLabel color='#000000' name="KAFKA" /> },
{ key: "REDIS", value: "REDIS", component: <LegentLabel color='#a41e11' name="REDIS" /> },]
interface ServiceMapModalProps {
isOpen: boolean;
onOpen: () => void;
onClose: () => void;
getServiceMapDataApi: () => Promise<any>
}
export const ServiceMapModal: React.FC<ServiceMapModalProps> = ({ isOpen, onClose, getServiceMapDataApi }) => {
const commonClasses = useCommonStyles();
const [isLoading, setIsLoading] = useState<boolean>(true);
const [graphData, setGraphData] = useState<GraphData>({ nodes: [], edges: [] });
const [checkedProtocols, setCheckedProtocols] = useState(protocols.map(x => x.key))
const [checkedServices, setCheckedServices] = useState([])
const [serviceMapApiData, setServiceMapApiData] = useState<ServiceMapGraph>({ edges: [], nodes: [] })
const [servicesSearchVal, setServicesSearchVal] = useState("")
const [graphOptions, setGraphOptions] = useState(ServiceMapOptions);
const getServiceMapData = useCallback(async () => {
try {
setIsLoading(true)
const serviceMapData: ServiceMapGraph = await getServiceMapDataApi()
setServiceMapApiData(serviceMapData)
const newGraphData: GraphData = { nodes: [], edges: [] }
if (serviceMapData.nodes) {
newGraphData.nodes = serviceMapData.nodes.map(mapNodesDatatoGraph)
}
if (serviceMapData.edges) {
newGraphData.edges = serviceMapData.edges.map(mapEdgesDatatoGraph)
}
setGraphData(newGraphData)
} catch (ex) {
toast.error("An error occurred while loading Mizu Service Map, see console for mode details", { containerId: TOAST_CONTAINER_ID });
console.error(ex);
} finally {
setIsLoading(false)
}
// eslint-disable-next-line
}, [isOpen])
const mapNodesDatatoGraph = node => {
return {
id: node.id,
value: node.count,
label: (node.entry.name === "unresolved") ? node.name : `${node.entry.name} (${node.name})`,
title: "Count: " + node.name,
isResolved: node.entry.resolved
}
}
const mapEdgesDatatoGraph = edge => {
return {
from: edge.source.id,
to: edge.destination.id,
value: edge.count,
label: edge.count.toString(),
color: {
color: edge.protocol.backgroundColor,
highlight: edge.protocol.backgroundColor
},
}
}
const mapToKeyValForFilter = (arr) => arr.map(mapNodesDatatoGraph)
.map((edge) => { return { key: edge.label, value: edge.label } })
.sort((a, b) => { return a.key.localeCompare(b.key) });
const getServicesForFilter = useMemo(() => {
const resolved = mapToKeyValForFilter(serviceMapApiData.nodes?.filter(x => x.resolved))
const unResolved = mapToKeyValForFilter(serviceMapApiData.nodes?.filter(x => !x.resolved))
return [...resolved, ...unResolved]
}, [serviceMapApiData])
const filterServiceMap = (newProtocolsFilters?: any[], newServiceFilters?: string[]) => {
const filterProt = newProtocolsFilters || checkedProtocols
const filterService = newServiceFilters || checkedServices
setCheckedProtocols(filterProt)
setCheckedServices(filterService)
const newGraphData: GraphData = {
nodes: serviceMapApiData.nodes?.map(mapNodesDatatoGraph).filter(node => filterService.includes(node.label)),
edges: serviceMapApiData.edges?.filter(edge => filterProt.includes(edge.protocol.abbr)).map(mapEdgesDatatoGraph)
}
setGraphData(newGraphData);
}
useEffect(() => {
if (checkedServices.length > 0) return // only after refresh
filterServiceMap(checkedProtocols, getServicesForFilter.map(x => x.key).filter(serviceName => !Utils.isIpAddress(serviceName)))
}, [getServicesForFilter])
useEffect(() => {
getServiceMapData()
}, [getServiceMapData])
useEffect(() => {
if (graphData?.nodes?.length === 0) return;
let options = { ...graphOptions };
options.physics.barnesHut.avoidOverlap = graphData?.nodes?.length > 10 ? 0 : 1;
setGraphOptions(options);
// eslint-disable-next-line
}, [graphData?.nodes?.length])
const refreshServiceMap = debounce(() => {
getServiceMapData();
}, 500);
return (
<Modal
aria-labelledby="transition-modal-title"
aria-describedby="transition-modal-description"
open={isOpen}
onClose={onClose}
closeAfterTransition
BackdropComponent={Backdrop}
BackdropProps={{ timeout: 500 }}>
<Fade in={isOpen}>
<Box sx={modalStyle}>
<div className={styles.modalContainer}>
<div className={styles.filterSection}>
<Resizeable minWidth={170}>
<div className={styles.filterWrapper}>
<div className={styles.protocolsFilterList}>
<SelectList items={protocols} checkBoxWidth="5%" tableName={"Protocols"} multiSelect={true}
checkedValues={checkedProtocols} setCheckedValues={filterServiceMap} tableClassName={styles.filters} />
</div>
<div className={styles.separtorLine}></div>
<div className={styles.servicesFilter}>
<input className={commonClasses.textField + ` ${styles.servicesFilterSearch}`} placeholder="search service" value={servicesSearchVal} onChange={(event) => setServicesSearchVal(event.target.value)} />
<div className={styles.servicesFilterList}>
<SelectList items={getServicesForFilter} tableName={"Services"} tableClassName={styles.filters} multiSelect={true} searchValue={servicesSearchVal}
checkBoxWidth="5%" checkedValues={checkedServices} setCheckedValues={(newServicesForFilter) => filterServiceMap(null, newServicesForFilter)} />
</div>
</div>
</div>
</Resizeable>
</div>
<div className={styles.graphSection}>
<div style={{ display: "flex", justifyContent: "space-between" }}>
<Button style={{ marginLeft: "3%" }}
startIcon={<img src={refreshIcon} className="custom" alt="refresh" style={{ marginRight: "8%" }}></img>}
size="medium"
variant="contained"
className={commonClasses.outlinedButton + " " + commonClasses.imagedButton}
onClick={refreshServiceMap}
>
Refresh
</Button>
<img src={closeIcon} alt="close" onClick={() => onClose()} style={{ cursor: "pointer", userSelect: "none" }}></img>
</div>
{isLoading && <div className={spinnerStyle.spinnerContainer}>
<img alt="spinner" src={spinnerImg} style={{ height: 50 }} />
</div>}
{!isLoading && <div style={{ height: "100%", width: "100%" }}>
<Graph
graph={graphData}
options={graphOptions}
/>
</div>
}
</div>
</div>
</Box>
</Fade>
</Modal>
);
}
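For orientation, a minimal sketch of how this modal could be mounted from a parent view; the parent component, its Button, and the fetch helper below are illustrative assumptions and are not part of this diff — only the three props destructured above are real.
// Hypothetical parent (assumes React, useState, a Button component and the
// ServiceMapGraph type are in scope); values and names here are for illustration.
const ServiceMapButton: React.FC<{ fetchServiceMapData: () => Promise<ServiceMapGraph> }> = ({ fetchServiceMapData }) => {
    const [isOpen, setIsOpen] = useState(false);
    return <React.Fragment>
        <Button onClick={() => setIsOpen(true)}>Service Map</Button>
        <ServiceMapModal
            isOpen={isOpen}
            onClose={() => setIsOpen(false)}
            getServiceMapDataApi={fetchServiceMapData} />
    </React.Fragment>;
};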


@@ -0,0 +1,60 @@
export interface GraphData {
nodes: Node[];
edges: Edge[];
}
export interface Node {
id: number;
value: number;
label: string;
title?: string;
color?: object;
}
export interface Edge {
from: number;
to: number;
value: number;
label: string;
title?: string;
color?: object;
}
export interface ServiceMapNode {
id: number;
name: string;
entry: Entry;
count: number;
resolved: boolean;
}
export interface ServiceMapEdge {
source: ServiceMapNode;
destination: ServiceMapNode;
count: number;
protocol: Protocol;
}
export interface ServiceMapGraph {
nodes: ServiceMapNode[];
edges: ServiceMapEdge[];
}
export interface Entry {
ip: string;
port: string;
name: string;
}
export interface Protocol {
name: string;
abbr: string;
macro: string;
version: string;
backgroundColor: string;
foregroundColor: string;
fontSize: number;
referenceLink: string;
ports: string[];
priority: number;
}
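For reference, a hypothetical payload typed against these interfaces; the service names, counts and addresses are invented purely to illustrate the shape that ServiceMapModal's mapping functions consume.
// Illustrative sample only; values are made up.
const sample: ServiceMapGraph = {
    nodes: [
        { id: 1, name: "10.0.0.12", count: 42, resolved: true,
            entry: { ip: "10.0.0.12", port: "8080", name: "catalogue" } },
        { id: 2, name: "10.0.0.7", count: 17, resolved: false,
            entry: { ip: "10.0.0.7", port: "80", name: "unresolved" } }
    ],
    edges: []
};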


@@ -0,0 +1,83 @@
const ServiceMapOptions = {
physics: {
enabled: true,
solver: 'barnesHut',
barnesHut: {
theta: 0.5,
gravitationalConstant: -2000,
centralGravity: 0.3,
springLength: 180,
springConstant: 0.04,
damping: 0.09,
avoidOverlap: 0
},
},
layout: {
hierarchical: false,
randomSeed: 1 // always on node 1
},
nodes: {
shape: 'dot',
chosen: true,
color: {
background: '#27AE60',
border: '#000000',
highlight: {
background: '#27AE60',
border: '#000000',
},
},
font: {
color: '#343434',
size: 14, // px
face: 'arial',
background: 'none',
strokeWidth: 0, // px
strokeColor: '#ffffff',
align: 'center',
multi: false
},
borderWidth: 1.5,
borderWidthSelected: 2.5,
labelHighlightBold: true,
opacity: 1,
shadow: true,
},
edges: {
chosen: true,
dashes: false,
arrowStrikethrough: false,
arrows: {
to: {
enabled: true,
},
middle: {
enabled: false,
},
from: {
enabled: false,
}
},
smooth: {
enabled: true,
type: 'dynamic',
roundness: 1.0
},
font: {
color: '#343434',
size: 12, // px
face: 'arial',
background: 'none',
strokeWidth: 2, // px
strokeColor: '#ffffff',
align: 'horizontal',
multi: false,
},
labelHighlightBold: true,
selectionWidth: 1,
shadow: true,
},
autoResize: true,
};
export default ServiceMapOptions
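These options are handed to the Graph component in ServiceMapModal; a minimal sketch of that wiring (assuming React, Graph, GraphData and this module are imported), mirroring the runtime avoidOverlap tweak the modal applies.
// Sketch: same shallow clone as the modal uses, so avoidOverlap can be relaxed
// for graphs with more than 10 nodes before rendering.
const ServiceGraph: React.FC<{ graphData: GraphData }> = ({ graphData }) => {
    const options = { ...ServiceMapOptions };
    options.physics.barnesHut.avoidOverlap = graphData.nodes.length > 10 ? 0 : 1;
    return <Graph graph={graphData} options={options} />;
};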


@@ -0,0 +1,4 @@
<svg width="20" height="20" viewBox="0 0 20 20" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M18.591 9.99997C18.591 14.7446 14.7447 18.5909 10.0001 18.5909C5.25546 18.5909 1.40918 14.7446 1.40918 9.99997C1.40918 5.25534 5.25546 1.40906 10.0001 1.40906C14.7447 1.40906 18.591 5.25534 18.591 9.99997Z" fill="#E9EBF8" stroke="#BCCEFD"/>
<path d="M13.1604 8.23038L11.95 7.01994L10.1392 8.83078L8.32832 7.01994L7.11789 8.23038L8.92872 10.0412L7.12046 11.8495L8.33089 13.0599L10.1392 11.2517L11.9474 13.0599L13.1579 11.8495L11.3496 10.0412L13.1604 8.23038Z" fill="#205CF5"/>
</svg>


@@ -0,0 +1,3 @@
<svg width="26" height="26" viewBox="0 0 26 26" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M10.8337 11.9167H7.69308L7.69416 11.907C7.83561 11.2143 8.11247 10.5564 8.50883 9.97105C9.09865 9.10202 9.92598 8.42097 10.8922 8.00913C11.2193 7.87046 11.5606 7.7643 11.9083 7.69388C12.6297 7.54762 13.3731 7.54762 14.0945 7.69388C15.1312 7.90631 16.0825 8.41908 16.8299 9.1683L18.3639 7.63863C17.6725 6.94707 16.8546 6.39501 15.9546 6.01255C15.4956 5.81823 15.0184 5.67016 14.53 5.57055C13.5223 5.36581 12.4838 5.36581 11.4761 5.57055C10.9873 5.67057 10.5098 5.819 10.0504 6.01363C8.69682 6.58791 7.53808 7.54123 6.71374 8.7588C6.15895 9.5798 5.77099 10.5019 5.57191 11.4725C5.54158 11.6188 5.52533 11.7683 5.50366 11.9167H2.16699L6.50033 16.25L10.8337 11.9167ZM15.167 14.0834H18.3076L18.3065 14.092C18.0234 15.4806 17.205 16.7019 16.0282 17.4915C15.443 17.8882 14.7851 18.1651 14.0923 18.3062C13.3713 18.4525 12.6283 18.4525 11.9072 18.3062C11.2146 18.1648 10.5567 17.8879 9.97133 17.4915C9.68383 17.2971 9.41541 17.0758 9.16966 16.8307L7.63783 18.3625C8.32954 19.0539 9.14791 19.6056 10.0482 19.9875C10.5076 20.1825 10.9875 20.331 11.4728 20.4295C12.4801 20.6344 13.5184 20.6344 14.5257 20.4295C16.4676 20.0265 18.1757 18.8819 19.2869 17.2391C19.8412 16.4187 20.2288 15.4974 20.4277 14.5275C20.4569 14.3813 20.4742 14.2318 20.4959 14.0834H23.8337L19.5003 9.75005L15.167 14.0834Z" fill="#205CF5"/>
</svg>


@@ -0,0 +1,6 @@
<?xml version="1.0" encoding="utf-8"?>
<svg xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" style="margin: auto; background: none; display: block; shape-rendering: auto;" width="200px" height="200px" viewBox="0 0 100 100" preserveAspectRatio="xMidYMid">
<circle cx="50" cy="50" fill="none" stroke="#1d3f72" stroke-width="10" r="35" stroke-dasharray="164.93361431346415 56.97787143782138" transform="rotate(275.903 50 50)">
<animateTransform attributeName="transform" type="rotate" repeatCount="indefinite" dur="1s" values="0 50 50;360 50 50" keyTimes="0;1"></animateTransform>
</circle>
<!-- [ldio] generated by https://loading.io/ --></svg>


@@ -5,151 +5,217 @@ import Moment from 'moment';
import {EntryItem} from "./EntryListItem/EntryListItem";
import down from "assets/downImg.svg";
import spinner from 'assets/spinner.svg';
import {RecoilState, useRecoilState, useRecoilValue} from "recoil";
import {RecoilState, useRecoilState, useRecoilValue, useSetRecoilState} from "recoil";
import entriesAtom from "../../recoil/entries";
import queryAtom from "../../recoil/query";
import TrafficViewerApiAtom from "../../recoil/TrafficViewerApi";
import TrafficViewerApi from "./TrafficViewerApi";
import focusedEntryIdAtom from "../../recoil/focusedEntryId";
import {toast} from "react-toastify";
import {TOAST_CONTAINER_ID} from "../../configs/Consts";
import tappingStatusAtom from "../../recoil/tappingStatus";
import leftOffTopAtom from "../../recoil/leftOffTop";
interface EntriesListProps {
listEntryREF: any;
onSnapBrokenEvent: () => void;
isSnappedToBottom: boolean;
setIsSnappedToBottom: any;
queriedCurrent: number;
setQueriedCurrent: any;
queriedTotal: number;
setQueriedTotal: any;
startTime: number;
noMoreDataTop: boolean;
setNoMoreDataTop: (flag: boolean) => void;
leftOffTop: number;
setLeftOffTop: (leftOffTop: number) => void;
openWebSocket: (query: string, resetEntries: boolean) => void;
leftOffBottom: number;
truncatedTimestamp: number;
setTruncatedTimestamp: any;
scrollableRef: any;
ws: any;
listEntryREF: any;
onSnapBrokenEvent: () => void;
isSnappedToBottom: boolean;
setIsSnappedToBottom: any;
noMoreDataTop: boolean;
setNoMoreDataTop: (flag: boolean) => void;
openWebSocket: (query: string, resetEntries: boolean) => void;
scrollableRef: any;
ws: any;
}
export const EntriesList: React.FC<EntriesListProps> = ({listEntryREF, onSnapBrokenEvent, isSnappedToBottom, setIsSnappedToBottom, queriedCurrent, setQueriedCurrent, queriedTotal, setQueriedTotal, startTime, noMoreDataTop, setNoMoreDataTop, leftOffTop, setLeftOffTop, openWebSocket, leftOffBottom, truncatedTimestamp, setTruncatedTimestamp, scrollableRef, ws}) => {
export const EntriesList: React.FC<EntriesListProps> = ({
listEntryREF,
onSnapBrokenEvent,
isSnappedToBottom,
setIsSnappedToBottom,
noMoreDataTop,
setNoMoreDataTop,
openWebSocket,
scrollableRef,
ws
}) => {
const [entries, setEntries] = useRecoilState(entriesAtom);
const query = useRecoilValue(queryAtom);
const isWsConnectionClosed = ws?.current?.readyState !== WebSocket.OPEN;
const [entries, setEntries] = useRecoilState(entriesAtom);
const query = useRecoilValue(queryAtom);
const isWsConnectionClosed = ws?.current?.readyState !== WebSocket.OPEN;
const [focusedEntryId, setFocusedEntryId] = useRecoilState(focusedEntryIdAtom);
const [leftOffTop, setLeftOffTop] = useRecoilState(leftOffTopAtom);
const setTappingStatus = useSetRecoilState(tappingStatusAtom);
const trafficViewerApi = useRecoilValue(TrafficViewerApiAtom as RecoilState<TrafficViewerApi>)
const trafficViewerApi = useRecoilValue(TrafficViewerApiAtom as RecoilState<TrafficViewerApi>)
const [loadMoreTop, setLoadMoreTop] = useState(false);
const [isLoadingTop, setIsLoadingTop] = useState(false);
const [loadMoreTop, setLoadMoreTop] = useState(false);
const [isLoadingTop, setIsLoadingTop] = useState(false);
const [queriedTotal, setQueriedTotal] = useState(0);
const [startTime, setStartTime] = useState(0);
const [truncatedTimestamp, setTruncatedTimestamp] = useState(0);
useEffect(() => {
const list = document.getElementById('list').firstElementChild;
list.addEventListener('scroll', (e) => {
const el: any = e.target;
if(el.scrollTop === 0) {
setLoadMoreTop(true);
} else {
setNoMoreDataTop(false);
setLoadMoreTop(false);
}
});
}, [setLoadMoreTop, setNoMoreDataTop]);
const leftOffBottom = entries.length > 0 ? entries[entries.length - 1].id : "latest";
const memoizedEntries = useMemo(() => {
return entries;
},[entries]);
const getOldEntries = useCallback(async () => {
useEffect(() => {
const list = document.getElementById('list').firstElementChild;
list.addEventListener('scroll', (e) => {
const el: any = e.target;
if (el.scrollTop === 0) {
setLoadMoreTop(true);
} else {
setNoMoreDataTop(false);
setLoadMoreTop(false);
if (leftOffTop === null || leftOffTop <= 0) {
return;
}
setIsLoadingTop(true);
const data = await trafficViewerApi.fetchEntries(leftOffTop, -1, query, 100, 3000);
if (!data || data.data === null || data.meta === null) {
setNoMoreDataTop(true);
setIsLoadingTop(false);
return;
}
setLeftOffTop(data.meta.leftOff);
}
});
}, [setLoadMoreTop, setNoMoreDataTop]);
let scrollTo: boolean;
if (data.meta.leftOff === 0) {
setNoMoreDataTop(true);
scrollTo = false;
} else {
scrollTo = true;
}
setIsLoadingTop(false);
const memoizedEntries = useMemo(() => {
return entries;
}, [entries]);
const newEntries = [...data.data.reverse(), ...entries];
setEntries(newEntries);
const getOldEntries = useCallback(async () => {
setLoadMoreTop(false);
if (leftOffTop === "") {
return;
}
setIsLoadingTop(true);
const data = await trafficViewerApi.fetchEntries(leftOffTop, -1, query, 100, 3000);
if (!data || data.data === null || data.meta === null) {
setNoMoreDataTop(true);
setIsLoadingTop(false);
return;
}
setLeftOffTop(data.meta.leftOff);
setQueriedCurrent(queriedCurrent + data.meta.current);
setQueriedTotal(data.meta.total);
setTruncatedTimestamp(data.meta.truncatedTimestamp);
let scrollTo: boolean;
if (data.meta.leftOff === 0) {
setNoMoreDataTop(true);
scrollTo = false;
} else {
scrollTo = true;
}
setIsLoadingTop(false);
if (scrollTo) {
scrollableRef.current.scrollToIndex(data.data.length - 1);
}
},[setLoadMoreTop, setIsLoadingTop, entries, setEntries, query, setNoMoreDataTop, leftOffTop, setLeftOffTop, queriedCurrent, setQueriedCurrent, setQueriedTotal, setTruncatedTimestamp, scrollableRef]);
const newEntries = [...data.data.reverse(), ...entries];
if(newEntries.length > 10000) {
newEntries.splice(10000, newEntries.length - 10000)
}
setEntries(newEntries);
useEffect(() => {
if(!isWsConnectionClosed || !loadMoreTop || noMoreDataTop) return;
getOldEntries();
}, [loadMoreTop, noMoreDataTop, getOldEntries, isWsConnectionClosed]);
setQueriedTotal(data.meta.total);
setTruncatedTimestamp(data.meta.truncatedTimestamp);
const scrollbarVisible = scrollableRef.current?.childWrapperRef.current.clientHeight > scrollableRef.current?.wrapperRef.current.clientHeight;
if (scrollTo) {
scrollableRef.current.scrollToIndex(data.data.length - 1);
}
}, [setLoadMoreTop, setIsLoadingTop, entries, setEntries, query, setNoMoreDataTop, leftOffTop, setLeftOffTop, setQueriedTotal, setTruncatedTimestamp, scrollableRef]);
return <React.Fragment>
<div className={styles.list}>
<div id="list" ref={listEntryREF} className={styles.list}>
{isLoadingTop && <div className={styles.spinnerContainer}>
<img alt="spinner" src={spinner} style={{height: 25}}/>
</div>}
{noMoreDataTop && <div id="noMoreDataTop" className={styles.noMoreDataAvailable}>No more data available</div>}
<ScrollableFeedVirtualized ref={scrollableRef} itemHeight={48} marginTop={10} onSnapBroken={onSnapBrokenEvent}>
{false /* It's because the first child is ignored by ScrollableFeedVirtualized */}
{memoizedEntries.map(entry => <EntryItem
key={`entry-${entry.id}`}
entry={entry}
style={{}}
headingMode={false}
/>)}
</ScrollableFeedVirtualized>
<button type="button"
title="Fetch old records"
className={`${styles.btnOld} ${!scrollbarVisible && leftOffTop > 0 ? styles.showButton : styles.hideButton}`}
onClick={(_) => {
trafficViewerApi.webSocket.close()
getOldEntries();
}}>
<img alt="down" src={down} />
</button>
<button type="button"
title="Snap to bottom"
className={`${styles.btnLive} ${isSnappedToBottom && !isWsConnectionClosed ? styles.hideButton : styles.showButton}`}
onClick={(_) => {
if (isWsConnectionClosed) {
if (query) {
openWebSocket(`(${query}) and leftOff(${leftOffBottom})`, false);
} else {
openWebSocket(`leftOff(${leftOffBottom})`, false);
}
}
scrollableRef.current.jumpToBottom();
setIsSnappedToBottom(true);
}}>
<img alt="down" src={down} />
</button>
</div>
useEffect(() => {
if (!isWsConnectionClosed || !loadMoreTop || noMoreDataTop) return;
getOldEntries();
}, [loadMoreTop, noMoreDataTop, getOldEntries, isWsConnectionClosed]);
<div className={styles.footer}>
<div>Displaying <b id="entries-length">{entries?.length}</b> results out of <b id="total-entries">{queriedTotal}</b> total</div>
{startTime !== 0 && <div>Started listening at <span style={{marginRight: 5, fontWeight: 600, fontSize: 13}}>{Moment(truncatedTimestamp ? truncatedTimestamp : startTime).utc().format('MM/DD/YYYY, h:mm:ss.SSS A')}</span></div>}
</div>
</div>
</React.Fragment>;
const scrollbarVisible = scrollableRef.current?.childWrapperRef.current.clientHeight > scrollableRef.current?.wrapperRef.current.clientHeight;
if (ws.current) {
ws.current.onmessage = (e) => {
if (!e?.data) return;
const message = JSON.parse(e.data);
switch (message.messageType) {
case "entry":
const entry = message.data;
if (!focusedEntryId) setFocusedEntryId(entry.id);
const newEntries = [...entries, entry];
if (newEntries.length > 10000) {
setLeftOffTop(newEntries[0].id);
newEntries.splice(0, newEntries.length - 10000)
setNoMoreDataTop(false);
}
setEntries(newEntries);
break;
case "status":
setTappingStatus(message.tappingStatus);
break;
case "toast":
toast[message.data.type](message.data.text, {
theme: "colored",
autoClose: message.data.autoClose,
pauseOnHover: true,
progress: undefined,
containerId: TOAST_CONTAINER_ID
});
break;
case "queryMetadata":
setTruncatedTimestamp(message.data.truncatedTimestamp);
setQueriedTotal(message.data.total);
if (leftOffTop === "") {
setLeftOffTop(message.data.leftOff);
}
break;
case "startTime":
setStartTime(message.data);
break;
}
}
}
return <React.Fragment>
<div className={styles.list}>
<div id="list" ref={listEntryREF} className={styles.list}>
{isLoadingTop && <div className={styles.spinnerContainer}>
<img alt="spinner" src={spinner} style={{height: 25}}/>
</div>}
{noMoreDataTop && <div id="noMoreDataTop" className={styles.noMoreDataAvailable}>No more data available</div>}
<ScrollableFeedVirtualized ref={scrollableRef} itemHeight={48} marginTop={10} onSnapBroken={onSnapBrokenEvent}>
{false /* It's because the first child is ignored by ScrollableFeedVirtualized */}
{memoizedEntries.map(entry => <EntryItem
key={`entry-${entry.id}`}
entry={entry}
style={{}}
headingMode={false}
/>)}
</ScrollableFeedVirtualized>
<button type="button"
title="Fetch old records"
className={`${styles.btnOld} ${!scrollbarVisible && leftOffTop !== "" ? styles.showButton : styles.hideButton}`}
onClick={(_) => {
trafficViewerApi.webSocket.close()
getOldEntries();
}}>
<img alt="down" src={down}/>
</button>
<button type="button"
title="Snap to bottom"
className={`${styles.btnLive} ${isSnappedToBottom && !isWsConnectionClosed ? styles.hideButton : styles.showButton}`}
onClick={(_) => {
if (isWsConnectionClosed) {
if (query) {
openWebSocket(`(${query}) and leftOff("${leftOffBottom}")`, false);
} else {
openWebSocket(`leftOff("${leftOffBottom}")`, false);
}
}
scrollableRef.current.jumpToBottom();
setIsSnappedToBottom(true);
}}>
<img alt="down" src={down}/>
</button>
</div>
<div className={styles.footer}>
<div>Displaying <b id="entries-length">{entries?.length}</b> results out of <b
id="total-entries">{queriedTotal}</b> total
</div>
{startTime !== 0 && <div>Started listening at <span style={{
marginRight: 5,
fontWeight: 600,
fontSize: 13
}}>{Moment(truncatedTimestamp ? truncatedTimestamp : startTime).utc().format('MM/DD/YYYY, h:mm:ss.SSS A')}</span>
</div>}
</div>
</div>
</React.Fragment>;
};


@@ -5,14 +5,14 @@ import { makeStyles } from "@material-ui/core";
import Protocol from "../UI/Protocol"
import Queryable from "../UI/Queryable";
import { toast } from "react-toastify";
import { RecoilState, useRecoilState, useRecoilValue } from "recoil";
import { RecoilState, useRecoilValue } from "recoil";
import focusedEntryIdAtom from "../../recoil/focusedEntryId";
import trafficViewerApi from "../../recoil/TrafficViewerApi";
import TrafficViewerApi from "./TrafficViewerApi";
import TrafficViewerApiAtom from "../../recoil/TrafficViewerApi/atom";
import queryAtom from "../../recoil/query/atom";
import useWindowDimensions, { useRequestTextByWidth } from "../../hooks/WindowDimensionsHook";
import { TOAST_CONTAINER_ID } from "../../configs/Consts";
import spinner from "assets/spinner.svg";
const useStyles = makeStyles(() => ({
entryTitle: {
@@ -89,12 +89,13 @@ const EntryTitle: React.FC<any> = ({ protocol, data, elapsedTime }) => {
</div>;
};
const EntrySummary: React.FC<any> = ({ entry }) => {
const EntrySummary: React.FC<any> = ({ entry, namespace }) => {
return <EntryItem
key={`entry-${entry.id}`}
entry={entry}
style={{}}
headingMode={true}
namespace={namespace}
/>;
};
@@ -105,12 +106,13 @@ export const EntryDetailed = () => {
const focusedEntryId = useRecoilValue(focusedEntryIdAtom);
const trafficViewerApi = useRecoilValue(TrafficViewerApiAtom as RecoilState<TrafficViewerApi>)
const query = useRecoilValue(queryAtom);
const [isLoading, setIsLoading] = useState(false);
const [entryData, setEntryData] = useState(null);
useEffect(() => {
if (!focusedEntryId) return;
setEntryData(null);
setIsLoading(true);
(async () => {
try {
const entryData = await trafficViewerApi.getEntry(focusedEntryId, query);
@@ -125,20 +127,23 @@ export const EntryDetailed = () => {
});
}
console.error(error);
} finally {
setIsLoading(false);
}
})();
// eslint-disable-next-line
}, [focusedEntryId]);
return <React.Fragment>
{entryData && <EntryTitle
{isLoading && <div style={{textAlign: "center", width: "100%", marginTop: 50}}><img alt="spinner" src={spinner} style={{height: 60}}/></div>}
{!isLoading && entryData && <EntryTitle
protocol={entryData.protocol}
data={entryData.data}
elapsedTime={entryData.data.elapsedTime}
/>}
{entryData && <EntrySummary entry={entryData.base} />}
{!isLoading && entryData && <EntrySummary entry={entryData.base} namespace={entryData.data.namespace} />}
<React.Fragment>
{entryData && <EntryViewer
{!isLoading && entryData && <EntryViewer
representation={entryData.representation}
isRulesEnabled={entryData.isRulesEnabled}
rulesMatched={entryData.rulesMatched}


@@ -66,8 +66,10 @@
margin-top: -60px
.capture img
height: 20px
height: 14px
z-index: 1000
margin-top: 12px
margin-left: -2px
.endpointServiceContainer
display: flex
@@ -76,6 +78,7 @@
padding-right: 10px
padding-top: 4px
flex-grow: 1
padding-left: 10px
.separatorRight
display: flex


@@ -52,6 +52,7 @@ interface EntryProps {
entry: Entry;
style: object;
headingMode: boolean;
namespace?: string;
}
enum CaptureTypes {
@@ -62,11 +63,11 @@ enum CaptureTypes {
Ebpf = "ebpf",
}
export const EntryItem: React.FC<EntryProps> = ({entry, style, headingMode}) => {
export const EntryItem: React.FC<EntryProps> = ({entry, style, headingMode, namespace}) => {
const [focusedEntryId, setFocusedEntryId] = useRecoilState(focusedEntryIdAtom);
const [queryState, setQuery] = useRecoilState(queryAtom);
const isSelected = focusedEntryId === entry.id.toString();
const isSelected = focusedEntryId === entry.id;
const classification = getClassification(entry.status)
const numberOfRules = entry.rules.numberOfRules
@@ -140,17 +141,15 @@ export const EntryItem: React.FC<EntryProps> = ({entry, style, headingMode}) =>
const isStatusCodeEnabled = ((entry.proto.name === "http" && "status" in entry) || entry.status !== 0);
let endpointServiceContainer = "10px";
if (!isStatusCodeEnabled) endpointServiceContainer = "20px";
return <React.Fragment>
<div
id={`entry-${entry.id.toString()}`}
id={`entry-${entry.id}`}
className={`${styles.row}
${isSelected && !rule && !contractEnabled ? styles.rowSelected : additionalRulesProperties}`}
onClick={() => {
if (!setFocusedEntryId) return;
setFocusedEntryId(entry.id.toString());
setFocusedEntryId(entry.id);
}}
style={{
border: isSelected && !headingMode ? `1px ${entry.proto.backgroundColor} solid` : "1px transparent solid",
@@ -178,7 +177,7 @@ export const EntryItem: React.FC<EntryProps> = ({entry, style, headingMode}) =>
{isStatusCodeEnabled && <div>
<StatusCode statusCode={entry.status} statusQuery={entry.statusQuery}/>
</div>}
<div className={styles.endpointServiceContainer} style={{paddingLeft: endpointServiceContainer}}>
<div className={styles.endpointServiceContainer}>
<Summary method={entry.method} methodQuery={entry.methodQuery} summary={entry.summary} summaryQuery={entry.summaryQuery}/>
<div className={styles.resolvedName}>
<Queryable
@@ -226,6 +225,19 @@ export const EntryItem: React.FC<EntryProps> = ({entry, style, headingMode}) =>
: ""
}
<div className={styles.separatorRight}>
{headingMode ? <Queryable
query={`namespace == "${namespace}"`}
displayIconOnMouseOver={true}
flipped={true}
iconStyle={{marginRight: "16px"}}
>
<span
className={`${styles.tcpInfo} ${styles.ip}`}
title="Namespace"
>
{namespace}
</span>
</Queryable> : null}
<Queryable
query={`src.ip == "${entry.src.ip}"`}
displayIconOnMouseOver={true}

View File

@@ -16,21 +16,21 @@ import trafficViewerApiAtom from "../../recoil/TrafficViewerApi"
interface FiltersProps {
backgroundColor: string
openWebSocket: (query: string, resetEntries: boolean) => void;
reopenConnection: any;
}
export const Filters: React.FC<FiltersProps> = ({backgroundColor, openWebSocket}) => {
export const Filters: React.FC<FiltersProps> = ({backgroundColor, reopenConnection}) => {
return <div className={styles.container}>
<QueryForm
backgroundColor={backgroundColor}
openWebSocket={openWebSocket}
reopenConnection={reopenConnection}
/>
</div>;
};
interface QueryFormProps {
backgroundColor: string
openWebSocket: (query: string, resetEntries: boolean) => void;
reopenConnection: any;
}
export const modalStyle = {
@@ -47,11 +47,10 @@ export const modalStyle = {
color: '#000',
};
export const QueryForm: React.FC<QueryFormProps> = ({backgroundColor, openWebSocket}) => {
export const QueryForm: React.FC<QueryFormProps> = ({backgroundColor, reopenConnection}) => {
const formRef = useRef<HTMLFormElement>(null);
const [query, setQuery] = useRecoilState(queryAtom);
const trafficViewerApi = useRecoilValue(trafficViewerApiAtom)
const [openModal, setOpenModal] = useState(false);
@@ -63,12 +62,7 @@ export const QueryForm: React.FC<QueryFormProps> = ({backgroundColor, openWebSoc
}
const handleSubmit = (e) => {
trafficViewerApi.webSocket.close()
if (query) {
openWebSocket(`(${query}) and leftOff(-1)`, true);
} else {
openWebSocket(`leftOff(-1)`, true);
}
reopenConnection();
e.preventDefault();
}


@@ -6,7 +6,7 @@
flex-direction: column
overflow: hidden
flex-grow: 1
height: calc(100vh - 70px)
height: calc(100% - 70px)
.TrafficPageHeader
padding: 20px 24px
@@ -16,9 +16,8 @@
justify-content: space-between
.TrafficPageStreamStatus
display: flex
align-items: center
display: flex
align-items: center
.TrafficPageHeaderImage
width: 22px
@@ -113,4 +112,4 @@
.playPauseIcon
cursor: pointer
margin-right: 15px
height: 30px
height: 30px


@@ -1,25 +1,27 @@
import React, { useEffect, useMemo, useRef, useState } from "react";
import { Filters } from "./Filters";
import { EntriesList } from "./EntriesList";
import { makeStyles } from "@material-ui/core";
import React, {useEffect, useMemo, useRef, useState} from "react";
import {Filters} from "./Filters";
import {EntriesList} from "./EntriesList";
import {makeStyles} from "@material-ui/core";
import TrafficViewerStyles from "./TrafficViewer.module.sass";
import styles from '../style/EntriesList.module.sass';
import { EntryDetailed } from "./EntryDetailed";
import {EntryDetailed} from "./EntryDetailed";
import playIcon from 'assets/run.svg';
import pauseIcon from 'assets/pause.svg';
import variables from '../../variables.module.scss';
import { toast, ToastContainer } from 'react-toastify';
import {ToastContainer} from 'react-toastify';
import debounce from 'lodash/debounce';
import { RecoilRoot, RecoilState, useRecoilState, useRecoilValue, useSetRecoilState } from "recoil";
import {RecoilRoot, RecoilState, useRecoilState, useRecoilValue, useSetRecoilState} from "recoil";
import entriesAtom from "../../recoil/entries";
import focusedEntryIdAtom from "../../recoil/focusedEntryId";
import queryAtom from "../../recoil/query";
import { TLSWarning } from "../TLSWarning/TLSWarning";
import {TLSWarning} from "../TLSWarning/TLSWarning";
import trafficViewerApiAtom from "../../recoil/TrafficViewerApi"
import TrafficViewerApi from "./TrafficViewerApi";
import { StatusBar } from "../UI/StatusBar";
import {StatusBar} from "../UI/StatusBar";
import tappingStatusAtom from "../../recoil/tappingStatus/atom";
import { TOAST_CONTAINER_ID } from "../../configs/Consts";
import {TOAST_CONTAINER_ID} from "../../configs/Consts";
import leftOffTopAtom from "../../recoil/leftOffTop";
import { DEFAULT_QUERY } from '../../hooks/useWS';
const useLayoutStyles = makeStyles(() => ({
details: {
@@ -48,34 +50,31 @@ interface TrafficViewerProps {
actionButtons?: JSX.Element,
isShowStatusBar?: boolean,
webSocketUrl: string,
isCloseWebSocket: boolean,
shouldCloseWebSocket: boolean,
setShouldCloseWebSocket: (flag: boolean) => void,
isDemoBannerView: boolean
}
export const TrafficViewer: React.FC<TrafficViewerProps> = ({ setAnalyzeStatus, trafficViewerApiProp,
actionButtons, isShowStatusBar, webSocketUrl,
isCloseWebSocket, isDemoBannerView }) => {
export const TrafficViewer: React.FC<TrafficViewerProps> = ({
setAnalyzeStatus, trafficViewerApiProp,
actionButtons, isShowStatusBar, webSocketUrl,
shouldCloseWebSocket, setShouldCloseWebSocket, isDemoBannerView
}) => {
const classes = useLayoutStyles();
const [entries, setEntries] = useRecoilState(entriesAtom);
const [focusedEntryId, setFocusedEntryId] = useRecoilState(focusedEntryIdAtom);
const setEntries = useSetRecoilState(entriesAtom);
const setFocusedEntryId = useSetRecoilState(focusedEntryIdAtom);
const query = useRecoilValue(queryAtom);
const setTrafficViewerApiState = useSetRecoilState(trafficViewerApiAtom as RecoilState<TrafficViewerApi>)
const [tappingStatus, setTappingStatus] = useRecoilState(tappingStatusAtom);
const [noMoreDataTop, setNoMoreDataTop] = useState(false);
const [isSnappedToBottom, setIsSnappedToBottom] = useState(true);
const [forceRender, setForceRender] = useState(0);
const [wsReadyState, setWsReadyState] = useState(0);
const [queryBackgroundColor, setQueryBackgroundColor] = useState("#f5f5f5");
const [queriedCurrent, setQueriedCurrent] = useState(0);
const [queriedTotal, setQueriedTotal] = useState(0);
const [leftOffBottom, setLeftOffBottom] = useState(0);
const [leftOffTop, setLeftOffTop] = useState(null);
const [truncatedTimestamp, setTruncatedTimestamp] = useState(0);
const [startTime, setStartTime] = useState(0);
const setLeftOffTop = useSetRecoilState(leftOffTopAtom);
const scrollableRef = useRef(null);
const [showTLSWarning, setShowTLSWarning] = useState(false);
@@ -107,116 +106,76 @@ export const TrafficViewer: React.FC<TrafficViewerProps> = ({ setAnalyzeStatus,
}, [query, handleQueryChange]);
useEffect(() => {
isCloseWebSocket && closeWebSocket()
}, [isCloseWebSocket])
if(shouldCloseWebSocket){
closeWebSocket()
setShouldCloseWebSocket(false);
}
}, [shouldCloseWebSocket])
useEffect(() => {
reopenConnection()
}, [webSocketUrl])
const ws = useRef(null);
const openEmptyWebSocket = () => {
if (query) {
openWebSocket(`(${query}) and ${DEFAULT_QUERY}`, true);
} else {
openWebSocket(DEFAULT_QUERY, true);
}
}
const closeWebSocket = () => {
if (ws?.current?.readyState === WebSocket.OPEN) {
ws.current.close();
return true;
}
}
const listEntry = useRef(null);
const openWebSocket = (query: string, resetEntries: boolean) => {
if (resetEntries) {
setFocusedEntryId(null);
setEntries([]);
setQueriedCurrent(0);
setLeftOffTop(null);
setLeftOffTop("");
setNoMoreDataTop(false);
}
try {
ws.current = new WebSocket(webSocketUrl);
sendQueryWhenWsOpen(query);
ws.current.onopen = () => {
setWsReadyState(ws?.current?.readyState);
}
ws.current.onclose = () => {
if (window.location.pathname === "/")
setForceRender(forceRender + 1);
setWsReadyState(ws?.current?.readyState);
}
ws.current.onerror = (event) => {
console.error("WebSocket error:", event);
if (ws?.current?.readyState === WebSocket.OPEN) {
ws.current.close();
}
if (query) {
openWebSocket(`(${query}) and leftOff(${leftOffBottom})`, false);
} else {
openWebSocket(`leftOff(${leftOffBottom})`, false);
}
}
} catch (e) { }
} catch (e) {
}
}
const sendQueryWhenWsOpen = (query) => {
setTimeout(() => {
if (ws?.current?.readyState === WebSocket.OPEN) {
ws.current.send(JSON.stringify({ "query": query, "enableFullEntries": false }));
ws.current.send(JSON.stringify({"query": query, "enableFullEntries": false}));
} else {
sendQueryWhenWsOpen(query);
}
}, 500)
}
const closeWebSocket = () => {
if (ws?.current?.readyState === WebSocket.OPEN) {
ws.current.close();
}
}
if (ws.current) {
ws.current.onmessage = (e) => {
if (!e?.data) return;
const message = JSON.parse(e.data);
switch (message.messageType) {
case "entry":
const entry = message.data;
if (!focusedEntryId) setFocusedEntryId(entry.id.toString());
const newEntries = [...entries, entry];
if (newEntries.length === 10001) {
setLeftOffTop(newEntries[0].entry.id);
newEntries.shift();
setNoMoreDataTop(false);
}
setEntries(newEntries);
break;
case "status":
setTappingStatus(message.tappingStatus);
break;
case "analyzeStatus":
setAnalyzeStatus(message.analyzeStatus);
break;
case "outboundLink":
onTLSDetected(message.Data.DstIP);
break;
case "toast":
toast[message.data.type](message.data.text, {
theme: "colored",
autoClose: message.data.autoClose,
pauseOnHover: true,
progress: undefined,
containerId: TOAST_CONTAINER_ID
});
break;
case "queryMetadata":
setQueriedCurrent(queriedCurrent + message.data.current);
setQueriedTotal(message.data.total);
setLeftOffBottom(message.data.leftOff);
setTruncatedTimestamp(message.data.truncatedTimestamp);
if (leftOffTop === null) {
setLeftOffTop(message.data.leftOff - 1);
}
break;
case "startTime":
setStartTime(message.data);
break;
default:
console.error(
`unsupported websocket message type, Got: ${message.messageType}`
);
}
};
}
useEffect(() => {
setTrafficViewerApiState({ ...trafficViewerApiProp, webSocket: { close: closeWebSocket } });
setTrafficViewerApiState({...trafficViewerApiProp, webSocket: {close: closeWebSocket}});
(async () => {
openWebSocket("leftOff(-1)", true);
try {
const tapStatusResponse = await trafficViewerApiProp.tapStatus();
setTappingStatus(tapStatusResponse);
@@ -228,53 +187,48 @@ export const TrafficViewer: React.FC<TrafficViewerProps> = ({ setAnalyzeStatus,
console.error(error);
}
})()
// eslint-disable-next-line
}, []);
const toggleConnection = () => {
if (ws?.current?.readyState === WebSocket.OPEN) {
ws?.current?.close();
} else {
if (query) {
openWebSocket(`(${query}) and leftOff(-1)`, true);
} else {
openWebSocket(`leftOff(-1)`, true);
}
if (!closeWebSocket()) {
openEmptyWebSocket();
scrollableRef.current.jumpToBottom();
setIsSnappedToBottom(true);
}
}
const reopenConnection = async () => {
closeWebSocket()
openEmptyWebSocket();
scrollableRef.current.jumpToBottom();
setIsSnappedToBottom(true);
}
useEffect(() => {
return () => {
ws.current.close();
if (ws?.current?.readyState === WebSocket.OPEN) {
ws.current.close();
}
};
}, []);
const onTLSDetected = (destAddress: string) => {
addressesWithTLS.add(destAddress);
setAddressesWithTLS(new Set(addressesWithTLS));
if (!userDismissedTLSWarning) {
setShowTLSWarning(true);
}
};
const getConnectionIndicator = () => {
switch (ws?.current?.readyState) {
switch (wsReadyState) {
case WebSocket.OPEN:
return <div className={`${TrafficViewerStyles.indicatorContainer} ${TrafficViewerStyles.greenIndicatorContainer}`}>
<div className={`${TrafficViewerStyles.indicator} ${TrafficViewerStyles.greenIndicator}`} />
return <div
className={`${TrafficViewerStyles.indicatorContainer} ${TrafficViewerStyles.greenIndicatorContainer}`}>
<div className={`${TrafficViewerStyles.indicator} ${TrafficViewerStyles.greenIndicator}`}/>
</div>
default:
return <div className={`${TrafficViewerStyles.indicatorContainer} ${TrafficViewerStyles.redIndicatorContainer}`}>
<div className={`${TrafficViewerStyles.indicator} ${TrafficViewerStyles.redIndicator}`} />
return <div
className={`${TrafficViewerStyles.indicatorContainer} ${TrafficViewerStyles.redIndicatorContainer}`}>
<div className={`${TrafficViewerStyles.indicator} ${TrafficViewerStyles.redIndicator}`}/>
</div>
}
}
const getConnectionTitle = () => {
switch (ws?.current?.readyState) {
switch (wsReadyState) {
case WebSocket.OPEN:
return "streaming live traffic"
default:
@@ -291,13 +245,16 @@ export const TrafficViewer: React.FC<TrafficViewerProps> = ({ setAnalyzeStatus,
return (
<div className={TrafficViewerStyles.TrafficPage}>
{tappingStatus && isShowStatusBar && <StatusBar isDemoBannerView={isDemoBannerView} />}
{tappingStatus && isShowStatusBar && <StatusBar isDemoBannerView={isDemoBannerView}/>}
<div className={TrafficViewerStyles.TrafficPageHeader}>
<div className={TrafficViewerStyles.TrafficPageStreamStatus}>
<img className={TrafficViewerStyles.playPauseIcon} style={{ visibility: ws?.current?.readyState === WebSocket.OPEN ? "visible" : "hidden" }} alt="pause"
src={pauseIcon} onClick={toggleConnection} />
<img className={TrafficViewerStyles.playPauseIcon} style={{ position: "absolute", visibility: ws?.current?.readyState === WebSocket.OPEN ? "hidden" : "visible" }} alt="play"
src={playIcon} onClick={toggleConnection} />
<img className={TrafficViewerStyles.playPauseIcon}
style={{visibility: wsReadyState === WebSocket.OPEN ? "visible" : "hidden"}} alt="pause"
src={pauseIcon} onClick={toggleConnection}/>
<img className={TrafficViewerStyles.playPauseIcon}
style={{position: "absolute", visibility: wsReadyState === WebSocket.OPEN ? "hidden" : "visible"}}
alt="play"
src={playIcon} onClick={toggleConnection}/>
<div className={TrafficViewerStyles.connectionText}>
{getConnectionTitle()}
{getConnectionIndicator()}
@@ -309,8 +266,7 @@ export const TrafficViewer: React.FC<TrafficViewerProps> = ({ setAnalyzeStatus,
<div className={TrafficViewerStyles.TrafficPageListContainer}>
<Filters
backgroundColor={queryBackgroundColor}
openWebSocket={openWebSocket}
reopenConnection={reopenConnection}
/>
<div className={styles.container}>
<EntriesList
@@ -318,56 +274,48 @@ export const TrafficViewer: React.FC<TrafficViewerProps> = ({ setAnalyzeStatus,
onSnapBrokenEvent={onSnapBrokenEvent}
isSnappedToBottom={isSnappedToBottom}
setIsSnappedToBottom={setIsSnappedToBottom}
queriedCurrent={queriedCurrent}
setQueriedCurrent={setQueriedCurrent}
queriedTotal={queriedTotal}
setQueriedTotal={setQueriedTotal}
startTime={startTime}
noMoreDataTop={noMoreDataTop}
setNoMoreDataTop={setNoMoreDataTop}
leftOffTop={leftOffTop}
setLeftOffTop={setLeftOffTop}
openWebSocket={openWebSocket}
leftOffBottom={leftOffBottom}
truncatedTimestamp={truncatedTimestamp}
setTruncatedTimestamp={setTruncatedTimestamp}
scrollableRef={scrollableRef}
ws={ws}
/>
</div>
</div>
<div className={classes.details} id="rightSideContainer">
{focusedEntryId && <EntryDetailed />}
<EntryDetailed/>
</div>
</div>}
<TLSWarning showTLSWarning={showTLSWarning}
setShowTLSWarning={setShowTLSWarning}
addressesWithTLS={addressesWithTLS}
setAddressesWithTLS={setAddressesWithTLS}
userDismissedTLSWarning={userDismissedTLSWarning}
setUserDismissedTLSWarning={setUserDismissedTLSWarning} />
setShowTLSWarning={setShowTLSWarning}
addressesWithTLS={addressesWithTLS}
setAddressesWithTLS={setAddressesWithTLS}
userDismissedTLSWarning={userDismissedTLSWarning}
setUserDismissedTLSWarning={setUserDismissedTLSWarning}/>
</div>
);
};
const MemoiedTrafficViewer = React.memo(TrafficViewer)
const TrafficViewerContainer: React.FC<TrafficViewerProps> = ({ setAnalyzeStatus, trafficViewerApiProp,
actionButtons, isShowStatusBar = true,
webSocketUrl, isCloseWebSocket, isDemoBannerView }) => {
const TrafficViewerContainer: React.FC<TrafficViewerProps> = ({
setAnalyzeStatus, trafficViewerApiProp,
actionButtons, isShowStatusBar = true,
webSocketUrl, shouldCloseWebSocket, setShouldCloseWebSocket, isDemoBannerView
}) => {
return <RecoilRoot>
<MemoiedTrafficViewer actionButtons={actionButtons} isShowStatusBar={isShowStatusBar} webSocketUrl={webSocketUrl}
isCloseWebSocket={isCloseWebSocket} trafficViewerApiProp={trafficViewerApiProp}
setAnalyzeStatus={setAnalyzeStatus} isDemoBannerView={isDemoBannerView} />
shouldCloseWebSocket={shouldCloseWebSocket} setShouldCloseWebSocket={setShouldCloseWebSocket} trafficViewerApiProp={trafficViewerApiProp}
setAnalyzeStatus={setAnalyzeStatus} isDemoBannerView={isDemoBannerView}/>
<ToastContainer enableMultiContainer containerId={TOAST_CONTAINER_ID}
position="bottom-right"
autoClose={5000}
hideProgressBar={false}
newestOnTop={false}
closeOnClick
rtl={false}
pauseOnFocusLoss
draggable
pauseOnHover />
position="bottom-right"
autoClose={5000}
hideProgressBar={false}
newestOnTop={false}
closeOnClick
rtl={false}
pauseOnFocusLoss
draggable
pauseOnHover/>
</RecoilRoot>
}


@@ -1,18 +1,17 @@
import React, { CSSProperties } from "react";
import infoImg from 'assets/info.svg';
import styles from "./style/InformationIcon.module.sass"
const DEFUALT_LINK = "https://getmizu.io/docs"
export interface InformationIconProps{
export interface InformationIconProps {
link?: string,
style? : CSSProperties
style?: CSSProperties
}
export const InformationIcon: React.FC<InformationIconProps> = ({link,style}) => {
export const InformationIcon: React.FC<InformationIconProps> = ({ link, style }) => {
return <React.Fragment>
<a href={DEFUALT_LINK ? DEFUALT_LINK : link} style={style} className={styles.flex} title="documentation" target="_blank">
<img className="headerIcon" src={infoImg} alt="Info icon"/>
<a href={DEFUALT_LINK ? DEFUALT_LINK : link} style={style} className={styles.linkStyle} title="documentation" target="_blank">
<span>Docs</span>
</a>
</React.Fragment>
}


@@ -0,0 +1,20 @@
import React from "react";
import circleImg from 'assets/dotted-circle.svg';
import styles from './style/NoDataMessage.module.sass'
export interface Props {
messageText: string;
}
const NoDataMessage: React.FC<Props> = ({ messageText = "No data found" }) => {
return (
<div data-cy="noDataMessage" className={styles.messageContainer__noData}>
<div className={styles.container}>
<img src={circleImg} alt="No data Found"></img>
<div className={styles.messageContainer__noDataMessage}>{messageText}</div>
</div>
</div>
);
};
export default NoDataMessage;


@@ -54,7 +54,7 @@ const Protocol: React.FC<ProtocolProps> = ({protocol, horizontal}) => {
backgroundColor: protocol.backgroundColor,
color: protocol.foregroundColor,
fontSize: protocol.fontSize,
marginRight: "-20px",
marginRight: "-6px",
}}
title={protocol.longName}
>


@@ -0,0 +1,17 @@
import React from "react";
export interface Props {
checked: boolean;
onToggle: (checked: boolean) => any;
disabled?: boolean;
}
const Radio: React.FC<Props> = ({ checked, onToggle, disabled, ...props }) => {
return (
<div>
<input style={!disabled ? { cursor: "pointer" } : {}} type="radio" checked={checked} disabled={disabled} onChange={(event) => onToggle(event.target.checked)} {...props} />
</div>
);
};
export default Radio;


@@ -0,0 +1,61 @@
import React, { useRef, useState } from "react";
import styles from './style/Resizeable.module.sass'
export interface Props {
children
minWidth: number
}
const Resizeable: React.FC<Props> = ({ children, minWidth }) => {
const resizeble = useRef(null)
let mousePos = { x: 0, y: 0 }
let elementDimention = { w: 0, h: 0 }
let isPressed = false
const [elemWidth, setElemWidth] = useState(resizeble?.current?.style?.width)
const mouseDownHandler = function (e) {
// Get the current mouse position
mousePos = { x: e.clientX, y: e.clientY }
isPressed = true
// Calculate the dimension of element
const styles = resizeble.current.getBoundingClientRect();
elementDimention = { w: parseInt(styles.width, 10), h: parseInt(styles.height, 10) }
// Attach the listeners to `document`
window.addEventListener('mousemove', mouseMoveHandler);
window.addEventListener('mouseup', mouseUpHandler);
};
const mouseMoveHandler = function (e) {
if (isPressed) {
// How far the mouse has been moved
const dx = e.clientX - mousePos.x;
const widthEl = elementDimention.w + dx
if (widthEl >= minWidth)
// Adjust the dimension of element
setElemWidth(widthEl)
}
};
const mouseUpHandler = function () {
window.removeEventListener('mousemove', mouseMoveHandler);
window.removeEventListener('mouseup', mouseUpHandler);
isPressed = false
};
return (
<React.Fragment>
<div className={styles.resizable} ref={resizeble} style={{ width: elemWidth }}>
{children}
<div className={`${styles.resizer} ${styles.resizerRight}`} onMouseDown={mouseDownHandler}></div>
{/* <div className={`${styles.resizer} ${styles.resizerB}`} onMouseDown={mouseDownHandler}></div> -- FutureUse*/}
</div>
</React.Fragment>
);
};
export default Resizeable;
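ServiceMapModal wraps its filter panel in this component; a minimal usage sketch follows (the panel contents are placeholders, not part of this diff).
// Sketch: dragging the right-edge handle resizes the panel, never below minWidth pixels.
const FilterPanel: React.FC = () => (
    <Resizeable minWidth={170}>
        <div style={{ padding: 8 }}>filter controls go here</div>
    </Resizeable>
);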


@@ -0,0 +1,113 @@
import React, { useCallback, useEffect, useMemo, useState } from "react";
import Radio from "./Radio";
import styles from './style/SelectList.module.sass'
import NoDataMessage from "./NoDataMessage";
import Checkbox from "./Checkbox";
export interface Props {
items;
tableName: string;
checkedValues?: string[];
multiSelect: boolean;
searchValue?: string;
setCheckedValues: (newValues) => void;
tableClassName?
checkBoxWidth?: string
}
const SelectList: React.FC<Props> = ({ items, tableName, checkedValues = [], multiSelect = true, searchValue = "", setCheckedValues, tableClassName,
checkBoxWidth = 50 }) => {
const noItemsMessage = "No items to show";
const [headerChecked, setHeaderChecked] = useState(false)
const filteredValues = useMemo(() => {
return items.filter((listValue) => listValue?.value?.includes(searchValue));
}, [items, searchValue])
const filteredValuesKeys = useMemo(() => {
return filteredValues.map(x => x.key)
}, [filteredValues])
const toggleValue = (checkedKey) => {
if (!multiSelect) {
const newCheckedValues = [];
newCheckedValues.push(checkedKey);
setCheckedValues(newCheckedValues);
}
else {
const newCheckedValues = [...checkedValues];
let index = newCheckedValues.indexOf(checkedKey);
if (index > -1)
newCheckedValues.splice(index, 1);
else
newCheckedValues.push(checkedKey);
setCheckedValues(newCheckedValues);
}
}
useEffect(() => {
const setAllChecked = filteredValuesKeys.every(val => checkedValues.includes(val))
setHeaderChecked(setAllChecked)
}, [filteredValuesKeys, checkedValues])
const toggleAll = useCallback((shouldCheckAll) => {
let newChecked = checkedValues.filter(x => !filteredValuesKeys.includes(x))
if (shouldCheckAll) {
const disabledItems = items.filter(i => i.disabled).map(x => x.key)
newChecked = [...filteredValuesKeys, ...newChecked].filter(x => !disabledItems.includes(x))
}
setCheckedValues(newChecked)
}, [searchValue, checkedValues, filteredValuesKeys])
const dataFieldFunc = (listValue) => listValue.component ? listValue.component :
<span className={styles.nowrap} title={listValue.value}>
{listValue.value}
</span>
const tableHead = multiSelect ? <tr style={{ borderBottomWidth: "2px" }}>
<th style={{ width: checkBoxWidth }}><Checkbox data-cy="checkbox-all" checked={headerChecked}
onToggle={(isChecked) => toggleAll(isChecked)} /></th>
<th>{tableName}</th>
</tr> :
<tr style={{ borderBottomWidth: "2px" }}>
<th>{tableName}</th>
</tr>
const tableBody = filteredValues.length === 0 ?
<tr>
<td colSpan={2}>
<NoDataMessage messageText={noItemsMessage} />
</td>
</tr>
:
filteredValues?.map(listValue => {
return <tr key={listValue.key}>
<td style={{ width: checkBoxWidth }}>
{multiSelect && <Checkbox data-cy={"checkbox-" + listValue.value} disabled={listValue.disabled} checked={checkedValues.includes(listValue.key)} onToggle={() => toggleValue(listValue.key)} />}
{!multiSelect && <Radio data-cy={"radio-" + listValue.value} disabled={listValue.disabled} checked={checkedValues.includes(listValue.key)} onToggle={() => toggleValue(listValue.key)} />}
</td>
<td>
{dataFieldFunc(listValue)}
</td>
</tr>
}
)
return <div className={tableClassName ? tableClassName + ` ${styles.selectListTable}` : ` ${styles.selectListTable}`}>
<table cellPadding={5} style={{ borderCollapse: "collapse" }}>
<thead>
{tableHead}
</thead>
<tbody>
{tableBody}
</tbody>
</table>
</div>
}
export default SelectList;
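A usage sketch based on the calls in ServiceMapModal: items are { key, value } pairs (optionally disabled or carrying a custom component), the parent owns the checked state, and searchValue filters by value. The item keys below are placeholders.
// Sketch: controlled multi-select list with an external search box.
const ProtocolsFilter: React.FC = () => {
    const items = [
        { key: "http", value: "http" },
        { key: "redis", value: "redis", disabled: true }
    ];
    const [checked, setChecked] = useState<string[]>(["http"]);
    const [search, setSearch] = useState("");
    return <React.Fragment>
        <input value={search} onChange={e => setSearch(e.target.value)} placeholder="search" />
        <SelectList items={items} tableName={"Protocols"} multiSelect={true}
            searchValue={search} checkBoxWidth="5%"
            checkedValues={checked} setCheckedValues={setChecked} />
    </React.Fragment>;
};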


@@ -14,11 +14,10 @@ interface StatusBarProps {
isDemoBannerView: boolean;
}
export const StatusBar = ({isDemoBannerView}) => {
export const StatusBar: React.FC<StatusBarProps> = ({isDemoBannerView}) => {
const tappingStatus = useRecoilValue(tappingStatusAtom);
const [expandedBar, setExpandedBar] = useState(false);
const {uniqueNamespaces, amountOfPods, amountOfTappedPods, amountOfUntappedPods} = useRecoilValue(tappingStatusDetails);
return <div className={`${isDemoBannerView ? `${style.banner}` : ''} ${style.statusBar} ${(expandedBar ? `${style.expandedStatusBar}` : "")}`} onMouseOver={() => setExpandedBar(true)} onMouseLeave={() => setExpandedBar(false)} data-cy="expandedStatusBar">
<div className={style.podsCount}>
{tappingStatus.some(pod => !pod.isTapped) && <img src={warningIcon} alt="warning"/>}
@@ -39,7 +38,7 @@ export const StatusBar = ({isDemoBannerView}) => {
{tappingStatus.map(pod => <tr key={pod.name}>
<td style={{width: "40%"}}>{pod.name}</td>
<td style={{width: "40%"}}>{pod.namespace}</td>
<td style={{width: "20%", textAlign: "center"}}><img style={{height: 20}} alt="status" src={pod.isTapped ? successIcon : failIcon}/></td>
<td style={{width: "20%", textAlign: "center"}}>{pod.isTapped ? <img style={{height: 20}} alt="status" src={successIcon}/> : <img style={{height: 20}} alt="status" src={failIcon}/>}</td>
</tr>)}
</tbody>
</table>


@@ -37,9 +37,9 @@ export function getClassification(statusCode: number): string {
// 1 - 16 HTTP/2 (gRPC) status codes
// 2xx - 5xx HTTP/1.x status codes
if ((statusCode >= 200 && statusCode <= 399) || statusCode === 0) {
if (statusCode >= 200 && statusCode <= 399) {
classification = StatusCodeClassification.SUCCESS;
} else if (statusCode >= 400 || (statusCode >= 1 && statusCode <= 16)) {
} else if (statusCode >= 400) {
classification = StatusCodeClassification.FAILURE;
}
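In effect, only the plain HTTP ranges are classified here after this change; status 0 and the 1-16 gRPC codes fall through to the function's initial value, which is not shown in this hunk. A short illustration of the new boundaries, assuming getClassification and the enum are imported and the enum uses string values:
// Sketch of the new behavior (not part of this diff):
console.assert(getClassification(204) === StatusCodeClassification.SUCCESS); // 200-399
console.assert(getClassification(503) === StatusCodeClassification.FAILURE); // >= 400
// 0 and the 1-16 gRPC codes now fall through to the function's default value.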


@@ -0,0 +1,3 @@
<svg width="55" height="55" viewBox="0 0 55 55" fill="none" xmlns="http://www.w3.org/2000/svg">
<circle cx="27.5" cy="27.5" r="27" stroke="#BCCEFD" stroke-dasharray="6 6"/>
</svg>


@@ -1,5 +0,0 @@
<svg width="24" height="24" viewBox="0 0 24 24" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M19 21H6.14286C5.07143 21 4 20.32 4 18.96C4 17.6 5.07143 16.92 6.14286 16.92H19V4H6.14286C5.07143 4 4 5.02 4 6.04V18.96M16.8571 17.6V20.32V17.6Z" stroke="#627EF7" stroke-width="2" stroke-linecap="round" stroke-linejoin="round"/>
<rect x="8" y="7" width="7" height="2" fill="#627EF7"/>
<rect x="8" y="11" width="4" height="2" fill="#627EF7"/>
</svg>


Some files were not shown because too many files have changed in this diff.