Mirror of https://github.com/projectdiscovery/nuclei.git
Synced 2025-12-29 22:23:02 +00:00

Merge pull request #1271 from projectdiscovery/dev

v2.5.4 Release preparation

Commit e0e48837c6
.github/ISSUE_TEMPLATE/issue-report.md
vendored
2
.github/ISSUE_TEMPLATE/issue-report.md
vendored
@ -33,4 +33,4 @@ Example: steps to reproduce the behavior:
|
||||
|
||||
|
||||
### Anything else:
|
||||
<!-- Links? References? Screnshots? Anything that will give us more context about the issue that you are encountering! -->
|
||||
<!-- Links? References? Screenshots? Anything that will give us more context about the issue that you are encountering! -->
|
||||
|
||||
.github/workflows/template-validate.yml (vendored, new file)

@@ -0,0 +1,29 @@

```yaml
name: 🛠 Template Validate

on: [ push, pull_request ]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@master
      - uses: actions/setup-go@v2
        with:
          go-version: 1.17

      - name: Cache Go
        id: cache-go
        uses: actions/cache@v2
        with:
          path: /home/runner/go
          key: ${{ runner.os }}-go

      - name: Installing Nuclei
        if: steps.cache-go.outputs.cache-hit != 'true'
        run: |
          go install github.com/projectdiscovery/nuclei/v2/cmd/nuclei@latest

      - name: Template Validation
        run: |
          nuclei -validate
          nuclei -validate -w ./workflows
```
DESIGN.md (new file)

@@ -0,0 +1,604 @@
# Nuclei Architecture Document

A brief overview of the Nuclei engine architecture. This document will be kept updated as the engine progresses.

## pkg/templates

### Template

Template is the basic unit of input to the engine. It describes the requests to be made, the matching to be done, the data to extract, etc.

The template structure is described here. Template-level attributes are defined here, as well as convenience methods to validate, parse and compile templates, creating executers.

Any attributes required for the template, engine or requests to function are also set here.

Workflows are also compiled here, and their templates are loaded and compiled as well. Any validations on the provided paths are also done at this stage.

The `Parse` function is the main entry point, which returns a template for a `filePath` and `executorOptions`. It compiles all the requests for the template, all the workflows, as well as any self-contained requests, and also caches the templates in an in-memory cache.

### Preprocessors

Preprocessors are also applied here and can act at the template level. They receive the template data, which they can alter at will at runtime. The engine uses this mechanism for random string generation.

Custom preprocessors can be used if they satisfy the following interface:

```go
type Preprocessor interface {
	Process(data []byte) []byte
}
```
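As an illustration, a minimal custom preprocessor might look like the sketch below. The `markerReplacer` type and its fixed placeholder are hypothetical stand-ins; the engine's built-in preprocessor generates random strings rather than a fixed replacement.

```go
package main

import (
	"bytes"
	"fmt"
)

// Preprocessor mirrors the interface from pkg/templates: it receives the raw
// template bytes and may rewrite them before parsing.
type Preprocessor interface {
	Process(data []byte) []byte
}

// markerReplacer is a hypothetical preprocessor that substitutes a fixed
// placeholder everywhere it appears in the template data.
type markerReplacer struct {
	marker      []byte
	replacement []byte
}

func (m *markerReplacer) Process(data []byte) []byte {
	return bytes.ReplaceAll(data, m.marker, m.replacement)
}

func main() {
	var p Preprocessor = &markerReplacer{
		marker:      []byte("{{randstr}}"),
		replacement: []byte("a1b2c3"),
	}
	out := p.Process([]byte("id: test-{{randstr}}"))
	fmt.Println(string(out))
}
```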
## pkg/model

The model package implements the information structure for Nuclei Templates. `Info` contains all major metadata for the template. The `Classification` structure can also be used to provide additional context to vulnerability data.

It also specifies a `WorkflowLoader` interface that is used during workflow loading in the template compilation stage.

```go
type WorkflowLoader interface {
	GetTemplatePathsByTags(tags []string) []string
	GetTemplatePaths(templatesList []string, noValidate bool) []string
}
```

## pkg/protocols

The protocols package implements all the request protocols supported by Nuclei. As of now, this includes http, dns, network, headless and file requests.

### Request

It exposes a `Request` interface that is implemented by all the supported request protocols.

```go
// Request is an interface implemented by any protocol-based request generator.
type Request interface {
	Compile(options *ExecuterOptions) error
	Requests() int
	GetID() string
	Match(data map[string]interface{}, matcher *matchers.Matcher) (bool, []string)
	Extract(data map[string]interface{}, matcher *extractors.Extractor) map[string]struct{}
	ExecuteWithResults(input string, dynamicValues, previous output.InternalEvent, callback OutputEventCallback) error
	MakeResultEventItem(wrapped *output.InternalWrappedEvent) *output.ResultEvent
	MakeResultEvent(wrapped *output.InternalWrappedEvent) []*output.ResultEvent
	GetCompiledOperators() []*operators.Operators
}
```

Many of these methods are similar across protocols, while some are very protocol-specific.

A brief overview of the methods is provided below:

- **Compile** - Compiles the request with the provided options.
- **Requests** - Returns the total number of requests made.
- **GetID** - Returns any ID for the request.
- **Match** - Performs matching for patterns using matchers.
- **Extract** - Performs extraction for patterns using extractors.
- **ExecuteWithResults** - Executes the request for an input.
- **MakeResultEventItem** - Creates a single result event from the intermediate `InternalWrappedEvent` output structure.
- **MakeResultEvent** - Returns a slice of results based on an `InternalWrappedEvent` internal output event.
- **GetCompiledOperators** - Returns the compiled operators for the request.

The `MakeDefaultResultEvent` function can be used as a default for `MakeResultEvent` when no protocol-specific features need to be implemented for result generation.

For reference protocol request implementations, one can look at the packages below:

1. [pkg/protocols/http](./v2/pkg/protocols/http)
2. [pkg/protocols/dns](./v2/pkg/protocols/dns)
3. [pkg/protocols/network](./v2/pkg/protocols/network)

### Executer

All these request interfaces are converted to an Executer, which is also an interface defined in `pkg/protocols` and is used during the final execution of the template.

```go
// Executer is an interface implemented by any protocol-based request executer.
type Executer interface {
	Compile() error
	Requests() int
	Execute(input string) (bool, error)
	ExecuteWithResults(input string, callback OutputEventCallback) error
}
```

The `ExecuteWithResults` function accepts a callback, which is provided with results during execution in the form of an `*output.InternalWrappedEvent` structure.

The default executer is provided in `pkg/protocols/common/executer`. It takes a list of Requests and the relevant `ExecuterOptions`, and implements the Executer interface required for template execution. The executer created during the template compilation process comes from this package and is used as-is.

A different executer is the Clustered Requests executer, which implements Nuclei's request clustering functionality in `pkg/templates`. In cases where multiple templates can be clustered, we have a single HTTP request with multiple operator lists to match/extract. The first HTTP request is executed once, while all the template matchers/extractors are evaluated separately.

For workflow execution, a separate `RunWorkflow` function is used, which executes the workflow independently of template execution.

With this basic premise set, we can now start exploring the current runner implementation, which will also walk us through the architecture of Nuclei.

## internal/runner

### Template loading

The first process after all CLI-specific initialisation is loading the template/workflow paths that the user wants to run. This is done by the packages described below.

#### pkg/catalog

This package is used to resolve paths using mixed syntax. It takes a template directory and resolves template paths relative to both the provided template directory and the current user directory.

The syntax is very versatile and can include filenames, glob patterns, directories, absolute paths, and relative paths.
The next step is the initialisation of the reporting modules, which is handled in `pkg/reporting`.

#### pkg/reporting

The reporting module contains exporters and trackers, as well as a module for deduplication and a module for result formatting.

Exporters and Trackers are interfaces defined in `pkg/reporting`.

```go
// Tracker is an interface implemented by an issue tracker
type Tracker interface {
	CreateIssue(event *output.ResultEvent) error
}

// Exporter is an interface implemented by an issue exporter
type Exporter interface {
	Close() error
	Export(event *output.ResultEvent) error
}
```

Exporters include `Elasticsearch`, `markdown` and `sarif`. Trackers include `GitHub`, `Gitlab` and `Jira`.

Each exporter and tracker implements its own configuration in YAML format, and they are very modular in nature, so adding new ones is easy.
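As a sketch of how small a new exporter can be, a hypothetical exporter that prints each result as JSON to stdout is shown below. `ResultEvent` here is a reduced two-field stand-in for `output.ResultEvent`, and `stdoutExporter` is not part of Nuclei.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// ResultEvent is a reduced stand-in for output.ResultEvent.
type ResultEvent struct {
	TemplateID string `json:"template-id"`
	Host       string `json:"host"`
}

// Exporter mirrors the interface from pkg/reporting.
type Exporter interface {
	Close() error
	Export(event *ResultEvent) error
}

// stdoutExporter is a hypothetical exporter that prints each result as JSON.
type stdoutExporter struct{}

func (e *stdoutExporter) Export(event *ResultEvent) error {
	data, err := json.Marshal(event)
	if err != nil {
		return err
	}
	fmt.Println(string(data))
	return nil
}

// Close releases resources; nothing to do for stdout.
func (e *stdoutExporter) Close() error { return nil }

func main() {
	var exp Exporter = &stdoutExporter{}
	_ = exp.Export(&ResultEvent{TemplateID: "cname-service-detection", Host: "docs.hackerone.com"})
	_ = exp.Close()
}
```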
After reading all the inputs from various sources and initialising other miscellaneous options, the next bit is output writing, which is done using the `pkg/output` module.

#### pkg/output

The output package implements the output writing functionality for Nuclei.

The Output Writer implements the `Writer` interface, which is called each time Nuclei finds a result.

```go
// Writer is an interface which writes output to somewhere for nuclei events.
type Writer interface {
	Close()
	Colorizer() aurora.Aurora
	Write(*ResultEvent) error
	Request(templateID, url, requestType string, err error)
}
```

The `ResultEvent` structure is passed to the Nuclei output writer and contains the entire detail of a found result. Various intermediary types like `InternalWrappedEvent` and `InternalEvent` are used throughout Nuclei protocols and matchers to describe results at various stages of execution.

Interactsh is also initialised if it is not explicitly disabled.
#### pkg/protocols/common/interactsh

The Interactsh module is used to provide automatic out-of-band vulnerability identification in Nuclei.

It uses two LRU caches, one for storing interactions for request URLs and one for storing requests for interaction URLs. Both caches are used to correlate requests received by the Interactsh OOB server with the Nuclei instance. The [Interactsh Client](https://github.com/projectdiscovery/interactsh/pkg/client) package does most of the heavy lifting for this module.

Polling for interactions and server registration only starts when a template uses the interactsh module and is executed by Nuclei. After that, no further registration is required for the entire run.
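The reason two caches are needed is ordering: an interaction can reach the server before or after the issuing request is registered locally. A simplified, hypothetical sketch of that correlation idea (plain maps keyed by correlation ID; the real module uses bounded LRU caches and the real Interactsh types):

```go
package main

import "fmt"

// Interaction and RequestMeta are reduced stand-ins for the real types.
type Interaction struct{ Protocol, FullID string }
type RequestMeta struct{ TemplateID, Input string }

// correlator sketches the two-cache correlation idea: interactions and
// requests may arrive in either order, keyed by the correlation ID embedded
// in the interactsh URL.
type correlator struct {
	pendingInteractions map[string][]*Interaction
	pendingRequests     map[string]*RequestMeta
}

func newCorrelator() *correlator {
	return &correlator{
		pendingInteractions: map[string][]*Interaction{},
		pendingRequests:     map[string]*RequestMeta{},
	}
}

// OnInteraction records an interaction; if the issuing request is already
// known, the matching request is returned immediately.
func (c *correlator) OnInteraction(id string, i *Interaction) *RequestMeta {
	if req, ok := c.pendingRequests[id]; ok {
		return req
	}
	c.pendingInteractions[id] = append(c.pendingInteractions[id], i)
	return nil
}

// OnRequest records an outgoing request and returns any interactions that
// arrived for it before the request was registered.
func (c *correlator) OnRequest(id string, r *RequestMeta) []*Interaction {
	c.pendingRequests[id] = r
	early := c.pendingInteractions[id]
	delete(c.pendingInteractions, id)
	return early
}

func main() {
	c := newCorrelator()
	// The interaction arrives before the request is registered.
	c.OnInteraction("abc123", &Interaction{Protocol: "dns", FullID: "abc123"})
	early := c.OnRequest("abc123", &RequestMeta{TemplateID: "test", Input: "example.com"})
	fmt.Println(len(early))
}
```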
### RunEnumeration

Next we arrive at the `RunEnumeration` function of the runner.

A `HostErrorsCache` is initialised, which is used throughout the Nuclei enumeration run to keep track of errors per host and skip further requests once the errors exceed the provided threshold. The functionality for the error tracking cache is defined in [hosterrorscache.go](https://github.com/projectdiscovery/nuclei/blob/master/v2/pkg/protocols/common/hosterrorscache/hosterrorscache.go) and is pretty simplistic in nature.
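The essence of that cache can be sketched in a few lines. This is a simplified, hypothetical version (a mutex-guarded map with a threshold); the real implementation is bounded and tuned for concurrent use:

```go
package main

import (
	"fmt"
	"sync"
)

// hostErrorsCache counts errors per host and reports a host as skippable
// once the count crosses a threshold.
type hostErrorsCache struct {
	mu        sync.Mutex
	threshold int
	errors    map[string]int
}

func newHostErrorsCache(threshold int) *hostErrorsCache {
	return &hostErrorsCache{threshold: threshold, errors: map[string]int{}}
}

// MarkFailed records one error for the host.
func (c *hostErrorsCache) MarkFailed(host string) {
	c.mu.Lock()
	defer c.mu.Unlock()
	c.errors[host]++
}

// Check reports whether further requests to the host should be skipped.
func (c *hostErrorsCache) Check(host string) bool {
	c.mu.Lock()
	defer c.mu.Unlock()
	return c.errors[host] >= c.threshold
}

func main() {
	cache := newHostErrorsCache(3)
	for i := 0; i < 3; i++ {
		cache.MarkFailed("example.com")
	}
	fmt.Println(cache.Check("example.com"), cache.Check("other.com"))
}
```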
Next, the `WorkflowLoader` is initialised, which is used to load workflows. It lives in `v2/pkg/parsers/workflow_loader.go`.

Moving forward, the loader is initialised, which is responsible for using the catalog, passed tags, filters, paths, etc. to return compiled `Templates` and `Workflows`.

#### pkg/catalog/loader

First, the paths passed by the user as input are normalised to absolute paths, which is done by the `pkg/catalog` module. Next, the path filter module is used to remove the excluded template/workflow paths.

The `pkg/parsers` module's `LoadTemplate` and `LoadWorkflow` functions are used to check that the templates pass validation and are not excluded via tags/severity/etc. filters. If all checks pass, the template/workflow is parsed and returned in compiled form by the `pkg/templates` `Parse` function.

The `Parse` function compiles all the requests in a template and creates Executers from them, returning a runnable Template/Workflow structure.

The clustering module comes next. Its job is to cluster identical HTTP GET requests together (as a lot of templates perform the same GET requests many times, this is a good way to save many requests on large scans with lots of templates).
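The clustering idea above can be sketched as grouping by a request key. Everything here is a hypothetical simplification (`TemplateRequest` and `clusterKey` are illustrative names, and a real key would also cover headers, body, etc.):

```go
package main

import "fmt"

// TemplateRequest is a reduced stand-in for an HTTP template request.
type TemplateRequest struct {
	TemplateID string
	Method     string
	Path       string
}

// clusterKey builds a grouping key: requests that would produce the same
// wire request share a key, so one request can serve many templates.
func clusterKey(r TemplateRequest) string {
	return r.Method + " " + r.Path
}

// cluster groups template requests by key; each group can be executed once,
// with every member template's matchers evaluated on the single response.
func cluster(requests []TemplateRequest) map[string][]TemplateRequest {
	groups := map[string][]TemplateRequest{}
	for _, r := range requests {
		key := clusterKey(r)
		groups[key] = append(groups[key], r)
	}
	return groups
}

func main() {
	groups := cluster([]TemplateRequest{
		{TemplateID: "tech-detect", Method: "GET", Path: "/"},
		{TemplateID: "favicon-hash", Method: "GET", Path: "/"},
		{TemplateID: "robots-txt", Method: "GET", Path: "/robots.txt"},
	})
	// Three template requests collapse into two wire requests.
	fmt.Println(len(groups), len(groups["GET /"]))
}
```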
### pkg/operators

The operators package implements all the matching and extracting logic of Nuclei.

```go
// Operators contain the operators that can be applied on protocols
type Operators struct {
	Matchers          []*matchers.Matcher
	Extractors        []*extractors.Extractor
	MatchersCondition string
}
```

A protocol only needs to embed the `operators.Operators` type shown above, and it can utilise all the matching/extracting functionality of Nuclei.

```go
// MatchFunc performs matching operation for a matcher on model and returns true or false.
type MatchFunc func(data map[string]interface{}, matcher *matchers.Matcher) (bool, []string)

// ExtractFunc performs extracting operation for an extractor on model and returns true or false.
type ExtractFunc func(data map[string]interface{}, matcher *extractors.Extractor) map[string]struct{}

// Execute executes the operators on data and returns a result structure
func (operators *Operators) Execute(data map[string]interface{}, match MatchFunc, extract ExtractFunc, isDebug bool) (*Result, bool)
```

The core of this process is the `Execute` function, which takes an input dictionary as well as a Match and an Extract function, and returns a `Result` structure that is used later during Nuclei execution to check for results.

```go
// Result is a result structure created from operators running on data.
type Result struct {
	Matched        bool
	Extracted      bool
	Matches        map[string][]string
	Extracts       map[string][]string
	OutputExtracts []string
	DynamicValues  map[string]interface{}
	PayloadValues  map[string]interface{}
}
```

The internal logic for matching and extracting things like words, regexes, jq expressions, paths, etc. is specified in `pkg/operators/matchers` and `pkg/operators/extractors`. Those packages should be investigated for a further look into the topic.

### Template Execution

`pkg/core` provides the engine mechanism which runs the templates/workflows on inputs. It exposes an `Execute` function which performs the execution while also doing template clustering. The clustering can also optionally be disabled by the user.

An example of using the core engine is provided below.

```go
engine := core.New(r.options)
engine.SetExecuterOptions(executerOpts)
results := engine.ExecuteWithOpts(finalTemplates, r.hmapInputProvider, true)
```
### Using Nuclei From Go Code

An example of using Nuclei from Go code to run templates on targets is provided below.

```go
package main

import (
	"fmt"
	"log"
	"os"
	"path"

	"github.com/logrusorgru/aurora"
	"go.uber.org/ratelimit"

	"github.com/projectdiscovery/goflags"
	"github.com/projectdiscovery/nuclei/v2/pkg/catalog"
	"github.com/projectdiscovery/nuclei/v2/pkg/catalog/config"
	"github.com/projectdiscovery/nuclei/v2/pkg/catalog/loader"
	"github.com/projectdiscovery/nuclei/v2/pkg/core"
	"github.com/projectdiscovery/nuclei/v2/pkg/core/inputs"
	"github.com/projectdiscovery/nuclei/v2/pkg/output"
	"github.com/projectdiscovery/nuclei/v2/pkg/parsers"
	"github.com/projectdiscovery/nuclei/v2/pkg/protocols"
	"github.com/projectdiscovery/nuclei/v2/pkg/protocols/common/hosterrorscache"
	"github.com/projectdiscovery/nuclei/v2/pkg/protocols/common/interactsh"
	"github.com/projectdiscovery/nuclei/v2/pkg/protocols/common/protocolinit"
	"github.com/projectdiscovery/nuclei/v2/pkg/protocols/common/protocolstate"
	"github.com/projectdiscovery/nuclei/v2/pkg/reporting"
	"github.com/projectdiscovery/nuclei/v2/pkg/testutils"
	"github.com/projectdiscovery/nuclei/v2/pkg/types"
)

func main() {
	cache := hosterrorscache.New(30, hosterrorscache.DefaultMaxHostsCount)
	defer cache.Close()

	mockProgress := &testutils.MockProgressClient{}
	reportingClient, _ := reporting.New(&reporting.Options{}, "")
	defer reportingClient.Close()

	outputWriter := testutils.NewMockOutputWriter()
	outputWriter.WriteCallback = func(event *output.ResultEvent) {
		fmt.Printf("Got Result: %v\n", event)
	}

	defaultOpts := types.DefaultOptions()
	protocolstate.Init(defaultOpts)
	protocolinit.Init(defaultOpts)

	defaultOpts.Templates = goflags.StringSlice{"dns/cname-service-detection.yaml"}
	defaultOpts.ExcludeTags = config.ReadIgnoreFile().Tags

	interactOpts := interactsh.NewDefaultOptions(outputWriter, reportingClient, mockProgress)
	interactClient, err := interactsh.New(interactOpts)
	if err != nil {
		log.Fatalf("Could not create interact client: %s\n", err)
	}
	defer interactClient.Close()

	home, _ := os.UserHomeDir()
	catalog := catalog.New(path.Join(home, "nuclei-templates"))
	executerOpts := protocols.ExecuterOptions{
		Output:          outputWriter,
		Options:         defaultOpts,
		Progress:        mockProgress,
		Catalog:         catalog,
		IssuesClient:    reportingClient,
		RateLimiter:     ratelimit.New(150),
		Interactsh:      interactClient,
		HostErrorsCache: cache,
		Colorizer:       aurora.NewAurora(true),
	}
	engine := core.New(defaultOpts)
	engine.SetExecuterOptions(executerOpts)

	workflowLoader, err := parsers.NewLoader(&executerOpts)
	if err != nil {
		log.Fatalf("Could not create workflow loader: %s\n", err)
	}
	executerOpts.WorkflowLoader = workflowLoader

	store, err := loader.New(loader.NewConfig(defaultOpts, catalog, executerOpts))
	if err != nil {
		log.Fatalf("Could not create loader client: %s\n", err)
	}
	store.Load()

	input := &inputs.SimpleInputProvider{Inputs: []string{"docs.hackerone.com"}}
	_ = engine.Execute(store.Templates(), input)
}
```
### Adding a New Protocol

Protocols form the core of the Nuclei engine. All the request types like `http`, `dns`, etc. are implemented in the form of protocol requests.

A protocol must implement the `Protocol` and `Request` interfaces described above in `pkg/protocols`. We'll take the example of an existing protocol implementation, websocket, for this short reference around Nuclei internals.

The code for the websocket protocol is contained in `pkg/protocols/others/websocket`.

Below is a high-level skeleton of the websocket implementation with all the important parts present.

```go
package websocket

// Request is a request for the Websocket protocol
type Request struct {
	// Operators for the current request go here.
	operators.Operators `yaml:",inline,omitempty"`
	CompiledOperators   *operators.Operators `yaml:"-"`

	// description: |
	//   Address contains address for the request
	Address string `yaml:"address,omitempty" jsonschema:"title=address for the websocket request,description=Address contains address for the request"`

	// declarations here
}

// Compile compiles the request generators preparing any requests possible.
func (r *Request) Compile(options *protocols.ExecuterOptions) error {
	r.options = options

	// request compilation here as well as client creation

	if len(r.Matchers) > 0 || len(r.Extractors) > 0 {
		compiled := &r.Operators
		if err := compiled.Compile(); err != nil {
			return errors.Wrap(err, "could not compile operators")
		}
		r.CompiledOperators = compiled
	}
	return nil
}

// Requests returns the total number of requests the rule will perform
func (r *Request) Requests() int {
	if r.generator != nil {
		return r.generator.NewIterator().Total()
	}
	return 1
}

// GetID returns the ID for the request if any.
func (r *Request) GetID() string {
	return ""
}

// ExecuteWithResults executes the protocol requests and returns results instead of writing them.
func (r *Request) ExecuteWithResults(input string, dynamicValues, previous output.InternalEvent, callback protocols.OutputEventCallback) error {
	// payloads init here
	if err := r.executeRequestWithPayloads(input, hostname, value, previous, callback); err != nil {
		return err
	}
	return nil
}

// executeRequestWithPayloads executes a single protocol request with the given payloads.
func (r *Request) executeRequestWithPayloads(input, hostname string, dynamicValues, previous output.InternalEvent, callback protocols.OutputEventCallback) error {
	header := http.Header{}

	// make the actual request here after setting all options

	event := eventcreator.CreateEventWithAdditionalOptions(r, data, r.options.Options.Debug || r.options.Options.DebugResponse, func(internalWrappedEvent *output.InternalWrappedEvent) {
		internalWrappedEvent.OperatorsResult.PayloadValues = payloadValues
	})
	if r.options.Options.Debug || r.options.Options.DebugResponse {
		responseOutput := responseBuilder.String()
		gologger.Debug().Msgf("[%s] Dumped Websocket response for %s", r.options.TemplateID, input)
		gologger.Print().Msgf("%s", responsehighlighter.Highlight(event.OperatorsResult, responseOutput, r.options.Options.NoColor))
	}

	callback(event)
	return nil
}

func (r *Request) MakeResultEventItem(wrapped *output.InternalWrappedEvent) *output.ResultEvent {
	data := &output.ResultEvent{
		TemplateID:   types.ToString(r.options.TemplateID),
		TemplatePath: types.ToString(r.options.TemplatePath),
		// ... setting more values for result event
	}
	return data
}

// Match performs matching operation for a matcher on model and returns:
// true and a list of matched snippets if the matcher type supports it,
// otherwise false and an empty string slice
func (r *Request) Match(data map[string]interface{}, matcher *matchers.Matcher) (bool, []string) {
	return protocols.MakeDefaultMatchFunc(data, matcher)
}

// Extract performs extracting operation for an extractor on model and returns true or false.
func (r *Request) Extract(data map[string]interface{}, matcher *extractors.Extractor) map[string]struct{} {
	return protocols.MakeDefaultExtractFunc(data, matcher)
}

// MakeResultEvent creates a result event from internal wrapped event
func (r *Request) MakeResultEvent(wrapped *output.InternalWrappedEvent) []*output.ResultEvent {
	return protocols.MakeDefaultResultEvent(r, wrapped)
}

// GetCompiledOperators returns a list of the compiled operators
func (r *Request) GetCompiledOperators() []*operators.Operators {
	return []*operators.Operators{r.CompiledOperators}
}

// Type returns the type of the protocol request
func (r *Request) Type() templateTypes.ProtocolType {
	return templateTypes.WebsocketProtocol
}
```

Almost all of these protocols have boilerplate functions for which default implementations have been provided in the `providers` package. Examples are the implementations of `Match`, `Extract`, `MakeResultEvent`, `GetCompiledOperators`, etc., which are almost the same throughout the Nuclei protocols code. It is enough to copy-paste them unless customisation is required.
`eventcreator` package offers `CreateEventWithAdditionalOptions` function which can be used to create result events after doing request execution.
|
||||
|
||||
Step by step description of how to add a new protocol to Nuclei -
|
||||
|
||||
1. Add the protocol implementation in `pkg/protocols` directory. If it's a small protocol with fewer options, considering adding it to the `pkg/protocols/others` directory. Add the enum for the new protocol to `v2/pkg/templates/types/types.go`.
|
||||
|
||||
2. Add the protocol request structure to the `Template` structure fields. This is done in `pkg/templates/templates.go` with the corresponding import line.
|
||||
|
||||
```go
|
||||
|
||||
import (
|
||||
...
|
||||
"github.com/projectdiscovery/nuclei/v2/pkg/protocols/others/websocket"
|
||||
)
|
||||
|
||||
// Template is a YAML input file which defines all the requests and
|
||||
// other metadata for a template.
|
||||
type Template struct {
|
||||
...
|
||||
// description: |
|
||||
// Websocket contains the Websocket request to make in the template.
|
||||
RequestsWebsocket []*websocket.Request `yaml:"websocket,omitempty" json:"websocket,omitempty" jsonschema:"title=websocket requests to make,description=Websocket requests to make for the template"`
|
||||
...
|
||||
}
|
||||
```
|
||||
|
||||
Also add the protocol case to the `Type` function as well as the `TemplateTypes` array in the same `templates.go` file.
|
||||
|
||||
```go
|
||||
// TemplateTypes is a list of accepted template types
|
||||
var TemplateTypes = []string{
|
||||
...
|
||||
"websocket",
|
||||
}
|
||||
|
||||
// Type returns the type of the template
|
||||
func (t *Template) Type() templateTypes.ProtocolType {
|
||||
...
|
||||
case len(t.RequestsWebsocket) > 0:
|
||||
return templateTypes.WebsocketProtocol
|
||||
default:
|
||||
return ""
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
3. Add the protocol request to the `Requests` function and `compileProtocolRequests` function in the `compile.go` file in same directory.
|
||||
|
||||
```go
|
||||
|
||||
// Requests return the total request count for the template
|
||||
func (template *Template) Requests() int {
|
||||
return len(template.RequestsDNS) +
|
||||
...
|
||||
len(template.RequestsSSL) +
|
||||
len(template.RequestsWebsocket)
|
||||
}
|
||||
|
||||
|
||||
// compileProtocolRequests compiles all the protocol requests for the template
|
||||
func (template *Template) compileProtocolRequests(options protocols.ExecuterOptions) error {
|
||||
...
|
||||
|
||||
case len(template.RequestsWebsocket) > 0:
|
||||
requests = template.convertRequestToProtocolsRequest(template.RequestsWebsocket)
|
||||
}
|
||||
template.Executer = executer.NewExecuter(requests, &options)
|
||||
return nil
|
||||
}
|
||||
```
|
||||
|
||||
That's it, you've added a new protocol to Nuclei. The next good step would be to write integration tests which are described in `integration-tests` and `cmd/integration-tests` directories.
|
||||
|
||||
## Project Structure
|
||||
|
||||
- [v2/pkg/reporting](./v2/pkg/reporting) - Reporting modules for nuclei.
|
||||
- [v2/pkg/reporting/exporters/sarif](./v2/pkg/reporting/exporters/sarif) - Sarif Result Exporter
|
||||
- [v2/pkg/reporting/exporters/markdown](./v2/pkg/reporting/exporters/markdown) - Markdown Result Exporter
|
||||
- [v2/pkg/reporting/exporters/es](./v2/pkg/reporting/exporters/es) - Elasticsearch Result Exporter
|
||||
- [v2/pkg/reporting/dedupe](./v2/pkg/reporting/dedupe) - Dedupe module for Results
|
||||
- [v2/pkg/reporting/trackers/gitlab](./v2/pkg/reporting/trackers/gitlab) - Gitlab Issue Tracker Exporter
|
||||
- [v2/pkg/reporting/trackers/jira](./v2/pkg/reporting/trackers/jira) - Jira Issue Tracker Exporter
|
||||
- [v2/pkg/reporting/trackers/github](./v2/pkg/reporting/trackers/github) - GitHub Issue Tracker Exporter
|
||||
- [v2/pkg/reporting/format](./v2/pkg/reporting/format) - Result Formatting Functions
|
||||
- [v2/pkg/parsers](./v2/pkg/parsers) - Implements template as well as workflow loader for initial template discovery, validation and - loading.
|
||||
- [v2/pkg/types](./v2/pkg/types) - Contains CLI options as well as misc helper functions.
|
||||
- [v2/pkg/progress](./v2/pkg/progress) - Progress tracking
|
||||
- [v2/pkg/operators](./v2/pkg/operators) - Operators for Nuclei
|
||||
- [v2/pkg/operators/common/dsl](./v2/pkg/operators/common/dsl) - DSL functions for Nuclei YAML Syntax
|
||||
- [v2/pkg/operators/matchers](./v2/pkg/operators/matchers) - Matchers implementation
|
||||
- [v2/pkg/operators/extractors](./v2/pkg/operators/extractors) - Extractors implementation
|
||||
- [v2/pkg/catalog](./v2/pkg/catalog) - Template loading from disk helpers
|
||||
- [v2/pkg/catalog/config](./v2/pkg/catalog/config) - Internal configuration management
- [v2/pkg/catalog/loader](./v2/pkg/catalog/loader) - Implements loading and validation of templates and workflows.
- [v2/pkg/catalog/loader/filter](./v2/pkg/catalog/loader/filter) - Filters templates based on tags and paths
- [v2/pkg/output](./v2/pkg/output) - Output module for nuclei
- [v2/pkg/workflows](./v2/pkg/workflows) - Workflow execution logic + declarations
- [v2/pkg/utils](./v2/pkg/utils) - Utility functions
- [v2/pkg/model](./v2/pkg/model) - Template Info + misc
- [v2/pkg/templates](./v2/pkg/templates) - Templates core starting point
- [v2/pkg/templates/cache](./v2/pkg/templates/cache) - Templates cache
- [v2/pkg/protocols](./v2/pkg/protocols) - Protocol Specification
- [v2/pkg/protocols/file](./v2/pkg/protocols/file) - File protocol
- [v2/pkg/protocols/network](./v2/pkg/protocols/network) - Network protocol
- [v2/pkg/protocols/common/expressions](./v2/pkg/protocols/common/expressions) - Expression evaluation + Templating Support
- [v2/pkg/protocols/common/interactsh](./v2/pkg/protocols/common/interactsh) - Interactsh integration
- [v2/pkg/protocols/common/generators](./v2/pkg/protocols/common/generators) - Payload support for Requests (Sniper, etc.)
- [v2/pkg/protocols/common/executer](./v2/pkg/protocols/common/executer) - Default Template Executer
- [v2/pkg/protocols/common/replacer](./v2/pkg/protocols/common/replacer) - Template replacement helpers
- [v2/pkg/protocols/common/helpers/eventcreator](./v2/pkg/protocols/common/helpers/eventcreator) - Result event creator
- [v2/pkg/protocols/common/helpers/responsehighlighter](./v2/pkg/protocols/common/helpers/responsehighlighter) - Debug response highlighter
- [v2/pkg/protocols/common/helpers/deserialization](./v2/pkg/protocols/common/helpers/deserialization) - Deserialization helper functions
- [v2/pkg/protocols/common/hosterrorscache](./v2/pkg/protocols/common/hosterrorscache) - Host errors cache for tracking erroring hosts
- [v2/pkg/protocols/offlinehttp](./v2/pkg/protocols/offlinehttp) - Offline HTTP protocol
- [v2/pkg/protocols/http](./v2/pkg/protocols/http) - HTTP protocol
- [v2/pkg/protocols/http/race](./v2/pkg/protocols/http/race) - HTTP Race Module
- [v2/pkg/protocols/http/raw](./v2/pkg/protocols/http/raw) - HTTP Raw Request Support
- [v2/pkg/protocols/headless](./v2/pkg/protocols/headless) - Headless Module
- [v2/pkg/protocols/headless/engine](./v2/pkg/protocols/headless/engine) - Internal Headless implementation
- [v2/pkg/protocols/dns](./v2/pkg/protocols/dns) - DNS protocol
- [v2/pkg/projectfile](./v2/pkg/projectfile) - Project File Implementation

### Notes

1. The matching and interim-output functionality is somewhat complex and should be simplified.

@@ -1,7 +1,7 @@
-FROM golang:1.17.2-alpine as build-env
+FROM golang:1.17.3-alpine as build-env
 RUN go install -v github.com/projectdiscovery/nuclei/v2/cmd/nuclei@latest

-FROM alpine:3.14
+FROM alpine:3.15.0
 RUN apk add --no-cache bind-tools ca-certificates chromium
 COPY --from=build-env /go/bin/nuclei /usr/local/bin/nuclei
 ENTRYPOINT ["nuclei"]

30 README.md

@@ -63,7 +63,7 @@ go install -v github.com/projectdiscovery/nuclei/v2/cmd/nuclei@latest

### Nuclei Templates

-Nuclei has had built-in support for automatic template download/update as default since version [v2.5.2](https://github.com/projectdiscovery/nuclei/releases/tag/v2.5.2). [**Nuclei-Templates**](https://github.com/projectdiscovery/nuclei-templates) project provides a community-contributed list of ready-to-use templates that is constantly updated.
+Nuclei has built-in support for automatic template download/update as default since version [v2.5.2](https://github.com/projectdiscovery/nuclei/releases/tag/v2.5.2). [**Nuclei-Templates**](https://github.com/projectdiscovery/nuclei-templates) project provides a community-contributed list of ready-to-use templates that is constantly updated.

You may still use the `update-templates` flag to update the nuclei templates at any time; you can write your own checks for your individual workflow and needs following Nuclei's [templating guide](https://nuclei.projectdiscovery.io/templating-guide/).

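The templating guide linked above covers writing such custom checks. As a rough illustration only (the `id`, `name`, and matched header below are invented, not an official template), a minimal custom HTTP check follows the same shape as the integration-test templates later in this diff:

```yaml
# Hypothetical minimal custom check; all identifiers here are made up for illustration.
id: example-custom-check

info:
  name: Example Custom Check
  author: you
  severity: info

requests:
  - method: GET
    path:
      - "{{BaseURL}}"
    matchers:
      - type: word
        part: header
        words:
          - "X-Example-Header"   # assumed response header, for illustration
```

Validating such a file before use is exactly what the `nuclei -validate` step in the Template Validate workflow above does.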
@@ -96,19 +96,23 @@ TARGET:

TEMPLATES:
   -t, -templates string[]      template or template directory paths to include in the scan
   -tu, -template-url string[]  URL containing list of templates to run
   -nt, -new-templates          run only new templates added in latest nuclei-templates release
   -w, -workflows string[]      workflow or workflow directory paths to include in the scan
   -wu, -workflow-url string[]  URL containing list of workflows to run
   -validate                    validate the passed templates to nuclei
   -tl                          list all available templates

FILTERING:
   -tags string[]                     execute a subset of templates that contain the provided tags
-  -etags, -exclude-tags string[]     exclude templates with the provided tags
   -itags, -include-tags string[]     tags from the default deny list that permit executing more intrusive templates
-  -et, -exclude-templates string[]   template or template directory paths to exclude
+  -etags, -exclude-tags string[]     exclude templates with the provided tags
+  -it, -include-templates string[]   templates to be executed even if they are excluded either by default or configuration
-  -s, -severity value[]              Templates to run based on severity. Possible values - info,low,medium,high,critical
-  -es, -exclude-severity value[]     Templates to exclude based on severity. Possible values - info,low,medium,high,critical
+  -et, -exclude-templates string[]   template or template directory paths to exclude
+  -s, -severity value[]              Templates to run based on severity. Possible values info,low,medium,high,critical
+  -es, -exclude-severity value[]     Templates to exclude based on severity. Possible values info,low,medium,high,critical
+  -pt, -type value[]                 protocol types to be executed. Possible values dns,file,http,headless,network,workflow,ssl,websocket
+  -ept, -exclude-type value[]        protocol types to not be executed. Possible values dns,file,http,headless,network,workflow,ssl,websocket
   -a, -author string[]               execute templates that are (co-)created by the specified authors

OUTPUT:
@@ -120,6 +124,7 @@ OUTPUT:
   -nm, -no-meta                 don't display match metadata
   -nts, -no-timestamp           don't display timestamp metadata in CLI output
   -rdb, -report-db string       local nuclei reporting database (always use this to persist report data)
+  -ms, -matcher-status          show optional match failure status
   -me, -markdown-export string  directory to export results in markdown format
   -se, -sarif-export string     file to export results in SARIF format

@@ -132,6 +137,9 @@ CONFIGURATIONS:
   -sr, -system-resolvers    use system DNS resolving as error fallback
   -passive                  enable passive HTTP response processing mode
   -ev, -env-vars            enable environment variables to be used in template
+  -cc, -client-cert string  client certificate file (PEM-encoded) used for authenticating against scanned hosts
+  -ck, -client-key string   client key file (PEM-encoded) used for authenticating against scanned hosts
+  -ca, -client-ca string    client certificate authority file (PEM-encoded) used for authenticating against scanned hosts

INTERACTSH:
   -iserver, -interactsh-server string  interactsh server url for self-hosted instance (default "https://interactsh.com")

@@ -147,6 +155,8 @@ RATE-LIMIT:
   -rlm, -rate-limit-minute int    maximum number of requests to send per minute
   -bs, -bulk-size int             maximum number of hosts to be analyzed in parallel per template (default 25)
   -c, -concurrency int            maximum number of templates to be executed in parallel (default 25)
+  -hbs, -headless-bulk-size int   maximum number of headless hosts to be analyzed in parallel per template (default 10)
+  -hc, -headless-concurrency int  maximum number of headless templates to be executed in parallel (default 10)

OPTIMIZATIONS:
   -timeout int    time to wait in seconds before timeout (default 5)

@@ -167,12 +177,12 @@ DEBUG:
   -debug                     show all requests and responses
   -debug-req                 show all sent requests
   -debug-resp                show all received responses
-  -proxy, -proxy-url string  URL of the HTTP proxy server
-  -proxy-socks-url string    URL of the SOCKS proxy server
+  -p, -proxy string[]        List of HTTP(s)/SOCKS5 proxy to use (comma separated or file input)
   -tlog, -trace-log string   file to write sent requests trace log
   -elog, -error-log string   file to write sent requests error log
   -version                   show nuclei version
   -v, -verbose               show verbose output
-  -vv                        display extra verbose information
+  -vv                        display templates loaded for scan
   -tv, -templates-version    shows the version of the installed nuclei-templates

UPDATE:
@@ -252,7 +262,7 @@ Please check our other open-source projects that might fit into your bug bounty

Nuclei immensely improves how you approach security assessment by augmenting the manual, repetitive processes. Consultancies are already converting their manual assessment steps to Nuclei, which allows them to run a set of their custom assessment approaches across thousands of hosts in an automated manner.

-Pen-testers get the full power of our public templates and customization capabilities to speed-up their assessment process, and specifically with the regression cycle where you can easily verify the fix.
+Pen-testers get the full power of our public templates and customization capabilities to speed up their assessment process, and specifically with the regression cycle where you can easily verify the fix.

- Easily create your compliance, standards suite (e.g. OWASP Top 10) checklist.
- With capabilities like [fuzz](https://nuclei.projectdiscovery.io/templating-guide/#advance-fuzzing) and [workflows](https://nuclei.projectdiscovery.io/templating-guide/#workflows), complex manual steps and repetitive assessment can be easily automated with Nuclei.

@@ -282,6 +292,8 @@ We have [a discussion thread around this](https://github.com/projectdiscovery/nu

### Resources

- [Scanning Live Web Applications with Nuclei in CI/CD Pipeline](https://blog.escape.tech/devsecops-part-iii-scanning-live-web-applications/) by [@TristanKalos](https://twitter.com/TristanKalos)
- [Community Powered Scanning with Nuclei](https://blog.projectdiscovery.io/community-powered-scanning-with-nuclei/)
+- [Nuclei Unleashed - Quickly write complex exploits](https://blog.projectdiscovery.io/nuclei-unleashed-quickly-write-complex-exploits/)
+- [Nuclei - Fuzz all the things](https://blog.projectdiscovery.io/nuclei-fuzz-all-the-things/)
1036 SYNTAX-REFERENCE.md
File diff suppressed because it is too large

18 integration_tests/headless/headless-basic.yaml (Normal file)
@@ -0,0 +1,18 @@
id: headless-basic

info:
  name: Headless Basic
  author: pdteam
  severity: info
  tags: headless

headless:
  - steps:
      - action: navigate
        args:
          url: "{{BaseURL}}/"

      - action: waitload
    matchers:
      - type: word
        words:
          - "<html>"
31 integration_tests/headless/headless-extract-values.yaml (Normal file)

@@ -0,0 +1,31 @@

id: headless-extract-values

info:
  name: Headless Extract Value
  author: pdteam
  severity: info
  tags: headless

headless:
  - steps:
      - action: navigate
        args:
          url: "{{BaseURL}}"
      - action: waitload
      # From headless/extract-urls.yaml
      - action: script
        name: extract
        args:
          code: |
            '\n' + [...new Set(Array.from(document.querySelectorAll('[src], [href], [url], [action]')).map(i => i.src || i.href || i.url || i.action))].join('\r\n') + '\n'

    matchers:
      - type: word
        words:
          - "test.html"

    extractors:
      - type: kval
        part: extract
        kval:
          - extract
24 integration_tests/headless/headless-header-action.yaml (Normal file)

@@ -0,0 +1,24 @@
id: headless-header-action

info:
  name: Headless Header Action
  author: pdteam
  severity: info
  tags: headless

headless:
  - steps:
      - action: setheader
        args:
          part: request
          key: Test
          value: test value

      - action: navigate
        args:
          url: "{{BaseURL}}/"

      - action: waitload
    matchers:
      - type: word
        words:
          - "test value"
23 integration_tests/http/dsl-matcher-variable.yaml (Normal file)

@@ -0,0 +1,23 @@
id: dsl-matcher-variable

info:
  name: dsl-matcher-variable
  author: pd-team
  severity: info

requests:
  - path:
      - "{{BaseURL}}"
    payloads:
      VALUES:
        - This
        - is
        - test
        - matcher
        - text
    matchers:
      - dsl:
          - 'contains(body,"{{VALUES}}")'
        type: dsl
16 integration_tests/http/get-case-insensitive.yaml (Normal file)

@@ -0,0 +1,16 @@
id: basic-get-case-insensitive

info:
  name: Basic GET Request
  author: pdteam
  severity: info

requests:
  - method: GET
    path:
      - "{{BaseURL}}"
    matchers:
      - type: word
        case-insensitive: true
        words:
          - "ThIS is TEsT MAtcHEr TExT"
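The `case-insensitive` flag on the word matcher above changes how the matcher's words are compared against the response body. A rough sketch of that semantics (an illustration only, not nuclei's actual Go implementation) is:

```python
def word_match(body, words, case_insensitive=False):
    # Sketch of an OR-style word matcher: match if any word occurs in the body.
    # With case_insensitive set, both sides are lowercased before comparison.
    if case_insensitive:
        body = body.lower()
        words = [w.lower() for w in words]
    return any(w in body for w in words)

print(word_match("...This is Test Matcher Text...",
                 ["ThIS is TEsT MAtcHEr TExT"],
                 case_insensitive=True))  # True
```

Without the flag, the mixed-case words in the template would fail against a normally-cased response body, which is exactly what this integration test exercises.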
23 integration_tests/http/get-redirects-chain-headers.yaml (Normal file)

@@ -0,0 +1,23 @@
id: basic-get-redirects-chain-headers

info:
  name: Basic GET Redirects Request With Chain header
  author: pdteam
  severity: info

requests:
  - method: GET
    path:
      - "{{BaseURL}}"
    redirects: true
    max-redirects: 3
    matchers-condition: and
    matchers:
      - type: word
        part: header
        words:
          - "TestRedirectHeaderMatch"

      - type: status
        status:
          - 302
19 integration_tests/http/interactsh.yaml (Normal file)

@@ -0,0 +1,19 @@
id: interactsh-integration-test

info:
  name: Interactsh Integration Test
  author: pdteam
  severity: info

requests:
  - method: GET
    path:
      - "{{BaseURL}}"
    headers:
      url: 'http://{{interactsh-url}}'

    matchers:
      - type: word
        part: interactsh_protocol  # Confirms the HTTP Interaction
        words:
          - "http"
@@ -7,7 +7,7 @@ info:

requests:
  - raw:
-      - |
+      - |+
        GET / HTTP/1.1
        Host:
        Content-Length: 4

10 integration_tests/loader/basic.yaml (Normal file)

@@ -0,0 +1,10 @@
id: workflow-example

info:
  name: Test Workflow Template
  author: pdteam
  severity: info

workflows:
  - template: workflow/match-1.yaml
  - template: workflow/match-2.yaml
11 integration_tests/loader/condition-matched.yaml (Normal file)

@@ -0,0 +1,11 @@
id: condition-matched-workflow

info:
  name: Condition Matched Workflow
  author: pdteam
  severity: info

workflows:
  - template: workflow/match-1.yaml
    subtemplates:
      - template: workflow/match-2.yaml
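The `subtemplates` section in this workflow gates execution: match-2.yaml runs only if match-1.yaml produced a match. A hypothetical sketch of that control flow (the function and the `execute` callback are invented for illustration, not nuclei's workflow engine):

```python
def run_workflow(parent, subtemplates, execute):
    # execute(template_path) -> bool: True when the template matched.
    # Subtemplates are executed only when the parent template matched.
    executed = [parent]
    if execute(parent):
        for sub in subtemplates:
            execute(sub)
            executed.append(sub)
    return executed
```

With this shape, a non-matching parent short-circuits the whole branch, which is what makes workflows cheap on hosts that fail the first check.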
17 integration_tests/loader/get-headers.yaml (Normal file)

@@ -0,0 +1,17 @@
id: basic-get-headers

info:
  name: Basic GET Headers Request
  author: pdteam
  severity: info

requests:
  - method: GET
    path:
      - "{{BaseURL}}"
    headers:
      test: nuclei
    matchers:
      - type: word
        words:
          - "This is test headers matcher text"
15 integration_tests/loader/get.yaml (Normal file)

@@ -0,0 +1,15 @@
id: basic-get

info:
  name: Basic GET Request
  author: pdteam
  severity: info

requests:
  - method: GET
    path:
      - "{{BaseURL}}"
    matchers:
      - type: word
        words:
          - "This is test matcher text"
2 integration_tests/loader/template-list.yaml (Normal file)

@@ -0,0 +1,2 @@
loader/get.yaml
loader/get-headers.yaml
2 integration_tests/loader/workflow-list.yaml (Normal file)

@@ -0,0 +1,2 @@
loader/basic.yaml
loader/condition-matched.yaml
@@ -3,35 +3,35 @@ allow-list:
 deny-list:
   severity: low

-# github contains configuration options for github issue tracker
+# GitHub contains configuration options for GitHub issue tracker
 github:
-  # base-url is the optional self-hosted github application url
+  # base-url is the optional self-hosted GitHub application url
   base-url: https://localhost:8443/github
-  # username is the username of the github user
+  # username is the username of the GitHub user
   username: test-username
-  # owner is the owner name of the repository for issues.
+  # owner is the owner name of the repository for issues
   owner: test-owner
-  # token is the token for github account.
+  # token is the token for GitHub account
   token: test-token
-  # project-name is the name of the repository.
+  # project-name is the name of the repository
   project-name: test-project
   # issue-label is the label of the created issue type
   issue-label: bug

-# gitlab contains configuration options for gitlab issue tracker
+# GitLab contains configuration options for gitlab issue tracker
 gitlab:
-  # base-url is the optional self-hosted gitlab application url
+  # base-url is the optional self-hosted GitLab application url
   base-url: https://localhost:8443/gitlab
-  # username is the username of the gitlab user
+  # username is the username of the GitLab user
   username: test-username
-  # token is the token for gitlab account.
+  # token is the token for GitLab account
   token: test-token
   # project-id is the ID of the repository.
   project-id: 1234
   # project-name is the name/id of the project(repository)
   project-name: "1234"
   # issue-label is the label of the created issue type
   issue-label: bug

-# jira contains configuration options for jira issue tracker
+# Jira contains configuration options for Jira issue tracker
 jira:
   # cloud is the boolean which tells if Jira instance is running in the cloud or on-prem version is used
   cloud: true
@@ -39,11 +39,11 @@ jira:
   update-existing: false
   # URL is the jira application url
   url: https://localhost/jira
-  # account-id is the account-id of the jira user or username in case of on-prem Jira
+  # account-id is the account-id of the Jira user or username in case of on-prem Jira
   account-id: test-account-id
-  # email is the email of the user for jira instance
+  # email is the email of the user for Jira instance
   email: test@test.com
-  # token is the token for jira instance or password in case of on-prem Jira
+  # token is the token for Jira instance or password in case of on-prem Jira
   token: test-token
   # project-name is the name of the project.
   project-name: test-project-name

@@ -5,47 +5,47 @@ allow-list:
 deny-list:
   severity: low

-# github contains configuration options for github issue tracker
-github:
-  # base-url is the optional self-hosted github application url
-  base-url: https://localhost:8443/github
-  # username is the username of the github user
+# GitHub contains configuration options for GitHub issue tracker
+GitHub:
+  # base-url is the optional self-hosted GitHub application url
+  base-url: https://localhost:8443/GitHub
+  # username is the username of the GitHub user
   username: test-username
   # owner is the owner name of the repository for issues.
   owner: test-owner
-  # token is the token for github account.
+  # token is the token for GitHub account.
   token: test-token
   # project-name is the name of the repository.
   project-name: test-project
   # issue-label is the label of the created issue type
   issue-label: bug

-# gitlab contains configuration options for gitlab issue tracker
-gitlab:
-  # base-url is the optional self-hosted gitlab application url
-  base-url: https://localhost:8443/gitlab
-  # username is the username of the gitlab user
+# GitLab contains configuration options for GitLab issue tracker
+GitLab:
+  # base-url is the optional self-hosted GitLab application url
+  base-url: https://localhost:8443/GitLab
+  # username is the username of the GitLab user
   username: test-username
-  # token is the token for gitlab account.
+  # token is the token for GitLab account.
   token: test-token
   # project-id is the ID of the repository.
   project-id: 1234
   # project-name is the name/id of the project(repository).
   project-name: "1234"
   # issue-label is the label of the created issue type
   issue-label: bug

-# jira contains configuration options for jira issue tracker
-jira:
+# Jira contains configuration options for Jira issue tracker
+Jira:
   # cloud is the boolean which tells if Jira instance is running in the cloud or on-prem version is used
   cloud: true
   # update-existing is the boolean which tells if the existing, opened issue should be updated or new one should be created
   update-existing: false
-  # URL is the jira application url
-  url: https://localhost/jira
-  # account-id is the account-id of the jira user or username in case of on-prem Jira
+  # URL is the Jira application url
+  url: https://localhost/Jira
+  # account-id is the account-id of the Jira user or username in case of on-prem Jira
   account-id: test-account-id
-  # email is the email of the user for jira instance
+  # email is the email of the user for Jira instance
   email: test@test.com
-  # token is the token for jira instance or password in case of on-prem Jira
+  # token is the token for Jira instance or password in case of on-prem Jira
   token: test-token
   # project-name is the name of the project.
   project-name: test-project-name

16 integration_tests/websocket/basic.yaml (Normal file)

@@ -0,0 +1,16 @@
id: basic-request

info:
  name: Basic Request
  author: pdteam
  severity: info

websocket:
  - address: '{{Scheme}}://{{Hostname}}'
    inputs:
      - data: hello
    matchers:
      - type: word
        words:
          - world
        part: response
16 integration_tests/websocket/cswsh.yaml (Normal file)

@@ -0,0 +1,16 @@
id: basic-cswsh-request

info:
  name: Basic cswsh Request
  author: pdteam
  severity: info

websocket:
  - address: '{{Scheme}}://{{Hostname}}'
    headers:
      Origin: 'http://evil.com'
    matchers:
      - type: word
        words:
          - true
        part: success
16 integration_tests/websocket/no-cswsh.yaml (Normal file)

@@ -0,0 +1,16 @@
id: basic-nocswsh-request

info:
  name: Basic Non-Vulnerable cswsh Request
  author: pdteam
  severity: info

websocket:
  - address: '{{Scheme}}://{{Hostname}}'
    headers:
      Origin: 'http://evil.com'
    matchers:
      - type: word
        words:
          - true
        part: success
16 integration_tests/websocket/path.yaml (Normal file)

@@ -0,0 +1,16 @@
id: basic-request-path

info:
  name: Basic Request Path
  author: pdteam
  severity: info

websocket:
  - address: '{{Scheme}}://{{Hostname}}'
    inputs:
      - data: hello
    matchers:
      - type: word
        words:
          - world
        part: response
@@ -130,15 +130,8 @@
         "description": "Name of the extractor"
       },
       "type": {
-        "enum": [
-          "regex",
-          "kval",
-          "json",
-          "xpath"
-        ],
-        "type": "string",
-        "title": "type of the extractor",
-        "description": "Type of the extractor"
+        "$schema": "http://json-schema.org/draft-04/schema#",
+        "$ref": "#/definitions/extractors.ExtractorTypeHolder"
       },
       "regex": {
         "items": {
@@ -191,26 +184,35 @@
         "type": "boolean",
         "title": "mark extracted value for internal variable use",
         "description": "Internal when set to true will allow using the value extracted in the next request for some protocols"
+      },
+      "case-insensitive": {
+        "type": "boolean",
+        "title": "use case insensitive extract",
+        "description": "use case insensitive extract"
       }
     },
     "additionalProperties": false,
     "type": "object"
   },
+  "extractors.ExtractorTypeHolder": {
+    "enum": [
+      "regex",
+      "kval",
+      "xpath",
+      "json"
+    ],
+    "type": "string",
+    "title": "type of the extractor",
+    "description": "Type of the extractor"
+  },
   "matchers.Matcher": {
     "required": [
       "type"
     ],
     "properties": {
       "type": {
-        "enum": [
-          "status",
-          "size",
-          "word",
-          "regex",
-          "binary",
-          "dsl"
-        ],
-        "type": "string",
+        "$schema": "http://json-schema.org/draft-04/schema#",
+        "$ref": "#/definitions/matchers.MatcherTypeHolder",
         "title": "type of matcher",
         "description": "Type of the matcher"
       },
@@ -293,11 +295,55 @@
         "type": "string",
         "title": "encoding for word field",
         "description": "Optional encoding for the word fields"
+      },
+      "case-insensitive": {
+        "type": "boolean",
+        "title": "use case insensitive match",
+        "description": "use case insensitive match"
       }
     },
     "additionalProperties": false,
     "type": "object"
   },
+  "matchers.MatcherTypeHolder": {
+    "enum": [
+      "word",
+      "regex",
+      "binary",
+      "status",
+      "size",
+      "dsl"
+    ],
+    "type": "string",
+    "title": "type of the matcher",
+    "description": "Type of the matcher,enum=status,enum=size,enum=word,enum=regex,enum=binary,enum=dsl"
+  },
+  "generators.AttackTypeHolder": {
+    "enum": [
+      "batteringram",
+      "pitchfork",
+      "clusterbomb"
+    ],
+    "type": "string",
+    "title": "type of the attack",
+    "description": "Type of the attack"
+  },
+  "dns.DNSRequestTypeHolder": {
+    "enum": [
+      "A",
+      "NS",
+      "DS",
+      "CNAME",
+      "SOA",
+      "PTR",
+      "MX",
+      "TXT",
+      "AAAA"
+    ],
+    "type": "string",
+    "title": "type of DNS request to make",
+    "description": "Type is the type of DNS request to make,enum=A,enum=NS,enum=DS,enum=CNAME,enum=SOA,enum=PTR,enum=MX,enum=TXT,enum=AAAA"
+  },
   "dns.Request": {
     "properties": {
       "matchers": {
@@ -336,18 +382,8 @@
         "description": "Name is the Hostname to make DNS request for"
       },
       "type": {
-        "enum": [
-          "A",
-          "NS",
-          "DS",
-          "CNAME",
-          "SOA",
-          "PTR",
-          "MX",
-          "TXT",
-          "AAAA"
-        ],
-        "type": "string",
+        "$schema": "http://json-schema.org/draft-04/schema#",
+        "$ref": "#/definitions/dns.DNSRequestTypeHolder",
         "title": "type of dns request to make",
         "description": "Type is the type of DNS request to make"
       },
@@ -369,6 +405,16 @@
         "title": "retries for dns request",
         "description": "Retries is the number of retries for the DNS request"
       },
+      "trace": {
+        "type": "boolean",
+        "title": "trace operation",
+        "description": "Trace performs a trace operation for the target."
+      },
+      "trace-max-recursion": {
+        "type": "integer",
+        "title": "trace-max-recursion level for dns request",
+        "description": "TraceMaxRecursion is the number of max recursion allowed for trace operations"
+      },
       "recursion": {
         "type": "boolean",
         "title": "recurse all servers",
@@ -519,6 +565,16 @@
         "description": "Description of the headless action"
       },
+      "action": {
+        "$schema": "http://json-schema.org/draft-04/schema#",
+        "$ref": "#/definitions/engine.ActionTypeHolder",
+        "title": "action to perform",
+        "description": "Type of actions to perform"
+      }
     },
     "additionalProperties": false,
     "type": "object"
   },
+  "engine.ActionTypeHolder": {
+    "enum": [
+      "navigate",
+      "script",
@@ -532,7 +588,7 @@
       "waitload",
       "getresource",
       "extract",
-      "setmethod",
+      "set-method",
       "addheader",
       "setheader",
       "deleteheader",
@@ -540,15 +596,29 @@
       "waitevent",
       "keyboard",
       "debug",
-      "sleep"
+      "sleep",
+      "waitvisible"
     ],
     "type": "string",
     "title": "action to perform",
-    "description": "Type of actions to perform"
-  }
+    "description": "Type of actions to perform,enum=navigate,enum=script,enum=click,enum=rightclick,enum=text,enum=screenshot,enum=time,enum=select,enum=files,enum=waitload,enum=getresource,enum=extract,enum=setmethod,enum=addheader,enum=setheader,enum=deleteheader,enum=setbody,enum=waitevent,enum=keyboard,enum=debug,enum=sleep"
+  },
   "additionalProperties": false,
   "type": "object"
+  "http.HTTPMethodTypeHolder": {
+    "enum": [
+      "GET",
+      "HEAD",
+      "POST",
+      "PUT",
+      "DELETE",
+      "CONNECT",
+      "OPTIONS",
+      "TRACE",
+      "PATCH",
+      "PURGE"
+    ],
+    "type": "string",
+    "title": "method is the HTTP request method",
+    "description": "Method is the HTTP Request Method,enum=GET,enum=HEAD,enum=POST,enum=PUT,enum=DELETE,enum=CONNECT,enum=OPTIONS,enum=TRACE,enum=PATCH,enum=PURGE"
+  },
   "http.Request": {
     "properties": {
@@ -605,29 +675,14 @@
         "description": "Optional name for the HTTP Request"
       },
       "attack": {
-        "enum": [
-          "batteringram",
-          "pitchfork",
-          "clusterbomb"
-        ],
-        "type": "string",
+        "$schema": "http://json-schema.org/draft-04/schema#",
+        "$ref": "#/definitions/generators.AttackTypeHolder",
         "title": "attack is the payload combination",
         "description": "Attack is the type of payload combinations to perform"
       },
       "method": {
-        "enum": [
-          "GET",
-          "HEAD",
-          "POST",
-          "PUT",
-          "DELETE",
-          "CONNECT",
-          "OPTIONS",
-          "TRACE",
-          "PATCH",
-          "PURGE"
-        ],
-        "type": "string",
+        "$schema": "http://json-schema.org/draft-04/schema#",
+        "$ref": "#/definitions/http.HTTPMethodTypeHolder",
         "title": "method is the http request method",
         "description": "Method is the HTTP Request Method"
       },
@@ -725,6 +780,11 @@
         "type": "boolean",
         "title": "skip variable checks",
         "description": "Skips the check for unresolved variables in request"
       },
+      "iterate-all": {
+        "type": "boolean",
+        "title": "iterate all the values",
+        "description": "Iterates all the values extracted from internal extractors"
+      }
     },
     "additionalProperties": false,
@@ -738,11 +798,8 @@
         "description": "Data is the data to send as the input"
       },
       "type": {
-        "enum": [
-          "hex",
-          "text"
-        ],
-        "type": "string",
+        "$schema": "http://json-schema.org/draft-04/schema#",
+        "$ref": "#/definitions/network.NetworkInputTypeHolder",
         "title": "type is the type of input data",
         "description": "Type of input specified in data field"
       },
@@ -760,6 +817,15 @@
     "additionalProperties": false,
     "type": "object"
   },
+  "network.NetworkInputTypeHolder": {
+    "enum": [
+      "hex",
+      "text"
+    ],
+    "type": "string",
+    "title": "type is the type of input data",
+    "description": "description=Type of input specified in data field,enum=hex,enum=text"
+  },
   "network.Request": {
     "properties": {
       "id": {
@@ -776,12 +842,7 @@
         "description": "Host to send network requests to"
       },
       "attack": {
-        "enum": [
-          "batteringram",
-          "pitchfork",
-          "clusterbomb"
-        ],
-        "type": "string",
+        "$ref": "#/definitions/generators.AttackTypeHolder",
         "title": "attack is the payload combination",
         "description": "Attack is the type of payload combinations to perform"
       },
@@ -809,6 +870,11 @@
         "title": "size of network response to read",
         "description": "Size of response to read at the end. Default is 1024 bytes"
       },
+      "read-all": {
+        "type": "boolean",
+        "title": "read all response stream",
+        "description": "Read all response stream till the server stops sending"
+      },
       "matchers": {
         "items": {
           "$ref": "#/definitions/matchers.Matcher"
@@ -838,6 +904,128 @@
     "additionalProperties": false,
     "type": "object"
   },
+  "ssl.Request": {
+    "properties": {
+      "matchers": {
+        "items": {
+          "$ref": "#/definitions/matchers.Matcher"
+        },
+        "type": "array",
+        "title": "matchers to run on response",
+        "description": "Detection mechanism to identify whether the request was successful by doing pattern matching"
+      },
+      "extractors": {
+        "items": {
+          "$ref": "#/definitions/extractors.Extractor"
+        },
+        "type": "array",
+        "title": "extractors to run on response",
+        "description": "Extractors contains the extraction mechanism for the request to identify and extract parts of the response"
+      },
},
|
||||
"matchers-condition": {
|
||||
"enum": [
|
||||
"and",
|
||||
"or"
|
||||
],
|
||||
"type": "string",
|
||||
"title": "condition between the matchers",
|
||||
"description": "Conditions between the matchers"
|
||||
},
|
||||
"address": {
|
||||
"type": "string",
|
||||
"title": "address for the ssl request",
|
||||
"description": "Address contains address for the request"
|
||||
}
|
||||
},
|
||||
"additionalProperties": false,
|
||||
"type": "object"
|
||||
},
|
||||
"websocket.Input": {
|
||||
"properties": {
|
||||
"data": {
|
||||
"type": "string",
|
||||
"title": "data to send as input",
|
||||
"description": "Data is the data to send as the input"
|
||||
},
|
||||
"name": {
|
||||
"type": "string",
|
||||
"title": "optional name for data read",
|
||||
"description": "Optional name of the data read to provide matching on"
|
||||
}
|
||||
},
|
||||
"additionalProperties": false,
|
||||
"type": "object"
|
||||
},
|
||||
"websocket.Request": {
|
||||
"properties": {
|
||||
"matchers": {
|
||||
"items": {
|
||||
"$ref": "#/definitions/matchers.Matcher"
|
||||
},
|
||||
"type": "array",
|
||||
"title": "matchers to run on response",
|
||||
"description": "Detection mechanism to identify whether the request was successful by doing pattern matching"
|
||||
},
|
||||
"extractors": {
|
||||
"items": {
|
||||
"$ref": "#/definitions/extractors.Extractor"
|
||||
},
|
||||
"type": "array",
|
||||
"title": "extractors to run on response",
|
||||
"description": "Extractors contains the extraction mechanism for the request to identify and extract parts of the response"
|
||||
},
|
||||
"matchers-condition": {
|
||||
"enum": [
|
||||
"and",
|
||||
"or"
|
||||
],
|
||||
"type": "string",
|
||||
"title": "condition between the matchers",
|
||||
"description": "Conditions between the matchers"
|
||||
},
|
||||
"address": {
|
||||
"type": "string",
|
||||
"title": "address for the websocket request",
|
||||
"description": "Address contains address for the request"
|
||||
},
|
||||
"inputs": {
|
||||
"items": {
|
||||
"$schema": "http://json-schema.org/draft-04/schema#",
|
||||
"$ref": "#/definitions/websocket.Input"
|
||||
},
|
||||
"type": "array",
|
||||
"title": "inputs for the websocket request",
|
||||
"description": "Inputs contains any input/output for the current request"
|
||||
},
|
||||
"headers": {
|
||||
"patternProperties": {
|
||||
".*": {
|
||||
"type": "string"
|
||||
}
|
||||
},
|
||||
"type": "object",
|
||||
"title": "headers contains the request headers",
|
||||
"description": "Headers contains headers for the request"
|
||||
},
|
||||
"attack": {
|
||||
"$ref": "#/definitions/generators.AttackTypeHolder",
|
||||
"title": "attack is the payload combination",
|
||||
"description": "Attack is the type of payload combinations to perform"
|
||||
},
|
||||
"payloads": {
|
||||
"patternProperties": {
|
||||
".*": {
|
||||
"additionalProperties": true
|
||||
}
|
||||
},
|
||||
"type": "object",
|
||||
"title": "payloads for the webosocket request",
|
||||
"description": "Payloads contains any payloads for the current request"
|
||||
}
|
||||
},
|
||||
"additionalProperties": false,
|
||||
"type": "object"
|
||||
},
|
||||
"templates.Template": {
|
||||
"required": [
|
||||
"id",
|
||||
@ -845,6 +1033,7 @@
|
||||
],
|
||||
"properties": {
|
||||
"id": {
|
||||
"pattern": "^([a-zA-Z0-9]+[-_])*[a-zA-Z0-9]+$",
|
||||
"type": "string",
|
||||
"title": "id of the template",
|
||||
"description": "The Unique ID for the template",
|
||||
@ -903,6 +1092,24 @@
|
||||
"title": "headless requests to make",
|
||||
"description": "Headless requests to make for the template"
|
||||
},
|
||||
"ssl": {
|
||||
"items": {
|
||||
"$schema": "http://json-schema.org/draft-04/schema#",
|
||||
"$ref": "#/definitions/ssl.Request"
|
||||
},
|
||||
"type": "array",
|
||||
"title": "ssl requests to make",
|
||||
"description": "SSL requests to make for the template"
|
||||
},
|
||||
"websocket": {
|
||||
"items": {
|
||||
"$schema": "http://json-schema.org/draft-04/schema#",
|
||||
"$ref": "#/definitions/websocket.Request"
|
||||
},
|
||||
"type": "array",
|
||||
"title": "websocket requests to make",
|
||||
"description": "Websocket requests to make for the template"
|
||||
},
|
||||
"workflows": {
|
||||
"items": {
|
||||
"$schema": "http://json-schema.org/draft-04/schema#",
|
||||
@ -916,6 +1123,11 @@
|
||||
"type": "boolean",
|
||||
"title": "mark requests as self-contained",
|
||||
"description": "Mark Requests for the template as self-contained"
|
||||
},
|
||||
"stop-at-first-match": {
|
||||
"type": "boolean",
|
||||
"title": "stop at first match",
|
||||
"description": "Stop at first match for the template"
|
||||
}
|
||||
},
|
||||
"additionalProperties": false,
|
||||
|
||||
@ -18,5 +18,9 @@ docs:
	./cmd/docgen/docgen docs.md nuclei-jsonschema.json
test:
	$(GOTEST) -v ./...
integration:
	bash ../integration_tests/run.sh
functional:
	bash cmd/functional-tests/run.sh
tidy:
	$(GOMOD) tidy
@ -10,6 +10,7 @@ import (
    "strings"

    "github.com/Ice3man543/nvd"

    "github.com/projectdiscovery/nuclei/v2/pkg/catalog"
)

@ -116,26 +117,26 @@ func getCVEData(client *nvd.Client, filePath, data string) {
    }
    if !strings.Contains(infoBlockClean, "classification") && (cvssScore != 0 && cvssMetrics != "") {
        changed = true
        newInfoBlock = newInfoBlock + fmt.Sprintf("\n classification:\n cvss-metrics: %s\n cvss-score: %.2f\n cve-id: %s", cvssMetrics, cvssScore, cveName)
        newInfoBlock += fmt.Sprintf("\n classification:\n cvss-metrics: %s\n cvss-score: %.2f\n cve-id: %s", cvssMetrics, cvssScore, cveName)
        if len(cweID) > 0 && (cweID[0] != "NVD-CWE-Other" && cweID[0] != "NVD-CWE-noinfo") {
            newInfoBlock = newInfoBlock + fmt.Sprintf("\n cwe-id: %s", strings.Join(cweID, ","))
            newInfoBlock += fmt.Sprintf("\n cwe-id: %s", strings.Join(cweID, ","))
        }
    }
    // If there is no description field, fill the description from CVE information
    if !strings.Contains(infoBlockClean, "description:") && len(cveItem.CVE.Description.DescriptionData) > 0 {
        changed = true
        newInfoBlock = newInfoBlock + fmt.Sprintf("\n description: %s", fmt.Sprintf("%q", cveItem.CVE.Description.DescriptionData[0].Value))
        newInfoBlock += fmt.Sprintf("\n description: %s", fmt.Sprintf("%q", cveItem.CVE.Description.DescriptionData[0].Value))
    }
    if !strings.Contains(infoBlockClean, "reference:") && len(cveItem.CVE.References.ReferenceData) > 0 {
        changed = true
        newInfoBlock = newInfoBlock + "\n reference:"
        newInfoBlock += "\n reference:"
        for _, reference := range cveItem.CVE.References.ReferenceData {
            newInfoBlock = newInfoBlock + fmt.Sprintf("\n - %s", reference.URL)
            newInfoBlock += fmt.Sprintf("\n - %s", reference.URL)
        }
    }
    newTemplate := strings.ReplaceAll(data, infoBlockClean, newInfoBlock)
    if changed {
        _ = ioutil.WriteFile(filePath, []byte(newTemplate), 0777)
        _ = ioutil.WriteFile(filePath, []byte(newTemplate), 0644)
        fmt.Printf("Wrote updated template to %s\n", filePath)
    }
}

@ -10,10 +10,11 @@ import (
    "strings"

    "github.com/alecthomas/jsonschema"

    "github.com/projectdiscovery/nuclei/v2/pkg/templates"
)

var pathRegex = regexp.MustCompile(`github.com/projectdiscovery/nuclei/v2/(?:internal|pkg)/(?:.*/)?([A-Za-z\.]+)`)
var pathRegex = regexp.MustCompile(`github\.com/projectdiscovery/nuclei/v2/(?:internal|pkg)/(?:.*/)?([A-Za-z.]+)`)

func main() {
    // Generate yaml syntax documentation
@ -21,7 +22,7 @@ func main() {
    if err != nil {
        log.Fatalf("Could not encode docs: %s\n", err)
    }
    err = ioutil.WriteFile(os.Args[1], data, 0777)
    err = ioutil.WriteFile(os.Args[1], data, 0644)
    if err != nil {
        log.Fatalf("Could not write docs: %s\n", err)
    }
@ -43,7 +44,7 @@ func main() {
    for _, match := range pathRegex.FindAllStringSubmatch(schema, -1) {
        schema = strings.ReplaceAll(schema, match[0], match[1])
    }
    err = ioutil.WriteFile(os.Args[2], []byte(schema), 0777)
    err = ioutil.WriteFile(os.Args[2], []byte(schema), 0644)
    if err != nil {
        log.Fatalf("Could not write jsonschema: %s\n", err)
    }

@ -11,7 +11,7 @@ import (
    "github.com/logrusorgru/aurora"
    "github.com/pkg/errors"

    "github.com/projectdiscovery/nuclei/v2/internal/testutils"
    "github.com/projectdiscovery/nuclei/v2/pkg/testutils"
)

var (

@ -1,7 +1,7 @@
package main

import (
    "github.com/projectdiscovery/nuclei/v2/internal/testutils"
    "github.com/projectdiscovery/nuclei/v2/pkg/testutils"
)

var dnsTestCases = map[string]testutils.TestCase{

81
v2/cmd/integration-test/headless.go
Normal file
@ -0,0 +1,81 @@
package main

import (
    "net/http"
    "net/http/httptest"

    "github.com/julienschmidt/httprouter"

    "github.com/projectdiscovery/nuclei/v2/pkg/testutils"
)

var headlessTestcases = map[string]testutils.TestCase{
    "headless/headless-basic.yaml":          &headlessBasic{},
    "headless/headless-header-action.yaml":  &headlessHeaderActions{},
    "headless/headless-extract-values.yaml": &headlessExtractValues{},
}

type headlessBasic struct{}

// Execute executes a test case and returns an error if one occurred
func (h *headlessBasic) Execute(filePath string) error {
    router := httprouter.New()
    router.GET("/", func(w http.ResponseWriter, r *http.Request, _ httprouter.Params) {
        _, _ = w.Write([]byte("<html><body></body></html>"))
    })
    ts := httptest.NewServer(router)
    defer ts.Close()

    results, err := testutils.RunNucleiTemplateAndGetResults(filePath, ts.URL, debug, "-headless")
    if err != nil {
        return err
    }
    if len(results) != 1 {
        return errIncorrectResultsCount(results)
    }
    return nil
}

type headlessHeaderActions struct{}

// Execute executes a test case and returns an error if one occurred
func (h *headlessHeaderActions) Execute(filePath string) error {
    router := httprouter.New()
    router.GET("/", func(w http.ResponseWriter, r *http.Request, _ httprouter.Params) {
        testValue := r.Header.Get("test")
        if r.Header.Get("test") != "" {
            _, _ = w.Write([]byte("<html><body>" + testValue + "</body></html>"))
        }
    })
    ts := httptest.NewServer(router)
    defer ts.Close()

    results, err := testutils.RunNucleiTemplateAndGetResults(filePath, ts.URL, debug, "-headless")
    if err != nil {
        return err
    }
    if len(results) != 1 {
        return errIncorrectResultsCount(results)
    }
    return nil
}

type headlessExtractValues struct{}

// Execute executes a test case and returns an error if one occurred
func (h *headlessExtractValues) Execute(filePath string) error {
    router := httprouter.New()
    router.GET("/", func(w http.ResponseWriter, r *http.Request, _ httprouter.Params) {
        _, _ = w.Write([]byte("<html><body><a href='/test.html'>test</a></body></html>"))
    })
    ts := httptest.NewServer(router)
    defer ts.Close()
    results, err := testutils.RunNucleiTemplateAndGetResults(filePath, ts.URL, debug, "-headless")
    if err != nil {
        return err
    }
    if len(results) != 3 {
        return errIncorrectResultsCount(results)
    }
    return nil
}
@ -11,7 +11,7 @@ import (

    "github.com/julienschmidt/httprouter"

    "github.com/projectdiscovery/nuclei/v2/internal/testutils"
    "github.com/projectdiscovery/nuclei/v2/pkg/testutils"
)

var httpTestcases = map[string]testutils.TestCase{
@ -31,7 +31,38 @@ var httpTestcases = map[string]testutils.TestCase{
    "http/raw-unsafe-request.yaml": &httpRawUnsafeRequest{},
    "http/request-condition.yaml": &httpRequestCondition{},
    "http/request-condition-new.yaml": &httpRequestCondition{},
    "http/interactsh.yaml": &httpInteractshRequest{},
    "http/self-contained.yaml": &httpRequestSelContained{},
    "http/get-case-insensitive.yaml": &httpGetCaseInsensitive{},
    "http/get.yaml,http/get-case-insensitive.yaml": &httpGetCaseInsensitiveCluster{},
    "http/get-redirects-chain-headers.yaml": &httpGetRedirectsChainHeaders{},
    "http/dsl-matcher-variable.yaml": &httpDSLVariable{},
}

type httpInteractshRequest struct{}

// Execute executes a test case and returns an error if one occurred
func (h *httpInteractshRequest) Execute(filePath string) error {
    router := httprouter.New()
    router.GET("/", func(w http.ResponseWriter, r *http.Request, _ httprouter.Params) {
        value := r.Header.Get("url")
        if value != "" {
            if resp, _ := http.DefaultClient.Get(value); resp != nil {
                resp.Body.Close()
            }
        }
    })
    ts := httptest.NewServer(router)
    defer ts.Close()

    results, err := testutils.RunNucleiTemplateAndGetResults(filePath, ts.URL, debug)
    if err != nil {
        return err
    }
    if len(results) != 1 {
        return errIncorrectResultsCount(results)
    }
    return nil
}

type httpGetHeaders struct{}
@ -125,6 +156,27 @@ func (h *httpGet) Execute(filePath string) error {
    return nil
}

type httpDSLVariable struct{}

// Execute executes a test case and returns an error if one occurred
func (h *httpDSLVariable) Execute(filePath string) error {
    router := httprouter.New()
    router.GET("/", func(w http.ResponseWriter, r *http.Request, _ httprouter.Params) {
        fmt.Fprintf(w, "This is test matcher text")
    })
    ts := httptest.NewServer(router)
    defer ts.Close()

    results, err := testutils.RunNucleiTemplateAndGetResults(filePath, ts.URL, debug)
    if err != nil {
        return err
    }
    if len(results) != 5 {
        return errIncorrectResultsCount(results)
    }
    return nil
}

type httpPostBody struct{}

// Execute executes a test case and returns an error if one occurred
@ -526,3 +578,75 @@ func (h *httpRequestSelContained) Execute(filePath string) error {
    }
    return nil
}

type httpGetCaseInsensitive struct{}

// Execute executes a test case and returns an error if one occurred
func (h *httpGetCaseInsensitive) Execute(filePath string) error {
    router := httprouter.New()
    router.GET("/", func(w http.ResponseWriter, r *http.Request, _ httprouter.Params) {
        fmt.Fprintf(w, "THIS IS TEST MATCHER TEXT")
    })
    ts := httptest.NewServer(router)
    defer ts.Close()

    results, err := testutils.RunNucleiTemplateAndGetResults(filePath, ts.URL, debug)
    if err != nil {
        return err
    }
    if len(results) != 1 {
        return errIncorrectResultsCount(results)
    }
    return nil
}

type httpGetCaseInsensitiveCluster struct{}

// Execute executes a test case and returns an error if one occurred
func (h *httpGetCaseInsensitiveCluster) Execute(filesPath string) error {
    router := httprouter.New()
    router.GET("/", func(w http.ResponseWriter, r *http.Request, _ httprouter.Params) {
        fmt.Fprintf(w, "This is test matcher text")
    })
    ts := httptest.NewServer(router)
    defer ts.Close()

    files := strings.Split(filesPath, ",")

    results, err := testutils.RunNucleiTemplateAndGetResults(files[0], ts.URL, debug, "-t", files[1])
    if err != nil {
        return err
    }
    if len(results) != 2 {
        return errIncorrectResultsCount(results)
    }
    return nil
}

type httpGetRedirectsChainHeaders struct{}

// Execute executes a test case and returns an error if one occurred
func (h *httpGetRedirectsChainHeaders) Execute(filePath string) error {
    router := httprouter.New()
    router.GET("/", func(w http.ResponseWriter, r *http.Request, _ httprouter.Params) {
        http.Redirect(w, r, "/redirected", http.StatusFound)
    })
    router.GET("/redirected", func(w http.ResponseWriter, r *http.Request, _ httprouter.Params) {
        w.Header().Set("Secret", "TestRedirectHeaderMatch")
        http.Redirect(w, r, "/final", http.StatusFound)
    })
    router.GET("/final", func(w http.ResponseWriter, r *http.Request, _ httprouter.Params) {
        _, _ = w.Write([]byte("ok"))
    })
    ts := httptest.NewServer(router)
    defer ts.Close()

    results, err := testutils.RunNucleiTemplateAndGetResults(filePath, ts.URL, debug)
    if err != nil {
        return err
    }
    if len(results) != 1 {
        return errIncorrectResultsCount(results)
    }
    return nil
}

@ -6,7 +6,8 @@ import (
    "strings"

    "github.com/logrusorgru/aurora"
    "github.com/projectdiscovery/nuclei/v2/internal/testutils"

    "github.com/projectdiscovery/nuclei/v2/pkg/testutils"
)

var (
@ -26,6 +27,9 @@ func main() {
        "network":   networkTestcases,
        "dns":       dnsTestCases,
        "workflow":  workflowTestcases,
        "loader":    loaderTestcases,
        "websocket": websocketTestCases,
        "headless":  headlessTestcases,
    }
    for proto, tests := range protocolTests {
        if protocol == "" || protocol == proto {
@ -50,5 +54,5 @@ func main() {
}

func errIncorrectResultsCount(results []string) error {
    return fmt.Errorf("incorrect number of results %s", strings.Join(results, "\n\t"))
    return fmt.Errorf("incorrect number of results \n\t%s", strings.Join(results, "\n\t"))
}

124
v2/cmd/integration-test/loader.go
Normal file
@ -0,0 +1,124 @@
package main

import (
    "fmt"
    "net/http"
    "net/http/httptest"
    "os"
    "strings"

    "github.com/julienschmidt/httprouter"

    "github.com/projectdiscovery/nuclei/v2/pkg/testutils"
)

var loaderTestcases = map[string]testutils.TestCase{
    "loader/template-list.yaml":             &remoteTemplateList{},
    "loader/workflow-list.yaml":             &remoteWorkflowList{},
    "loader/nonexistent-template-list.yaml": &nonExistentTemplateList{},
    "loader/nonexistent-workflow-list.yaml": &nonExistentWorkflowList{},
}

type remoteTemplateList struct{}

// Execute executes a test case and returns an error if one occurred
func (h *remoteTemplateList) Execute(templateList string) error {
    router := httprouter.New()

    router.GET("/", func(w http.ResponseWriter, r *http.Request, _ httprouter.Params) {
        fmt.Fprintf(w, "This is test matcher text")
        if strings.EqualFold(r.Header.Get("test"), "nuclei") {
            fmt.Fprintf(w, "This is test headers matcher text")
        }
    })

    router.GET("/template_list", func(w http.ResponseWriter, r *http.Request, _ httprouter.Params) {
        file, err := os.ReadFile(templateList)
        if err != nil {
            w.WriteHeader(500)
        }
        _, err = w.Write(file)
        if err != nil {
            w.WriteHeader(500)
        }
    })
    ts := httptest.NewServer(router)
    defer ts.Close()

    results, err := testutils.RunNucleiBareArgsAndGetResults(debug, "-target", ts.URL, "-tu", ts.URL+"/template_list")
    if err != nil {
        return err
    }
    if len(results) != 2 {
        return errIncorrectResultsCount(results)
    }
    return nil
}

type remoteWorkflowList struct{}

// Execute executes a test case and returns an error if one occurred
func (h *remoteWorkflowList) Execute(workflowList string) error {
    router := httprouter.New()

    router.GET("/", func(w http.ResponseWriter, r *http.Request, _ httprouter.Params) {
        fmt.Fprintf(w, "This is test matcher text")
        if strings.EqualFold(r.Header.Get("test"), "nuclei") {
            fmt.Fprintf(w, "This is test headers matcher text")
        }
    })

    router.GET("/workflow_list", func(w http.ResponseWriter, r *http.Request, _ httprouter.Params) {
        file, err := os.ReadFile(workflowList)
        if err != nil {
            w.WriteHeader(500)
        }
        _, err = w.Write(file)
        if err != nil {
            w.WriteHeader(500)
        }
    })
    ts := httptest.NewServer(router)
    defer ts.Close()

    results, err := testutils.RunNucleiBareArgsAndGetResults(debug, "-target", ts.URL, "-wu", ts.URL+"/workflow_list")
    if err != nil {
        return err
    }
    if len(results) != 3 {
        return errIncorrectResultsCount(results)
    }
    return nil
}

type nonExistentTemplateList struct{}

// Execute executes a test case and returns an error if one occurred
func (h *nonExistentTemplateList) Execute(nonExistingTemplateList string) error {
    router := httprouter.New()
    ts := httptest.NewServer(router)
    defer ts.Close()

    _, err := testutils.RunNucleiBareArgsAndGetResults(debug, "-target", ts.URL, "-tu", ts.URL+"/404")
    if err == nil {
        return fmt.Errorf("expected error for nonexistent template url")
    }

    return nil
}

type nonExistentWorkflowList struct{}

// Execute executes a test case and returns an error if one occurred
func (h *nonExistentWorkflowList) Execute(nonExistingWorkflowList string) error {
    router := httprouter.New()
    ts := httptest.NewServer(router)
    defer ts.Close()

    _, err := testutils.RunNucleiBareArgsAndGetResults(debug, "-target", ts.URL, "-wu", ts.URL+"/404")
    if err == nil {
        return fmt.Errorf("expected error for nonexistent workflow url")
    }

    return nil
}

@ -3,7 +3,7 @@ package main
import (
    "net"

    "github.com/projectdiscovery/nuclei/v2/internal/testutils"
    "github.com/projectdiscovery/nuclei/v2/pkg/testutils"
)

var networkTestcases = map[string]testutils.TestCase{

115
v2/cmd/integration-test/websocket.go
Normal file
@ -0,0 +1,115 @@
package main

import (
    "net"
    "strings"

    "github.com/gobwas/ws/wsutil"

    "github.com/projectdiscovery/nuclei/v2/pkg/testutils"
)

var websocketTestCases = map[string]testutils.TestCase{
    "websocket/basic.yaml":    &websocketBasic{},
    "websocket/cswsh.yaml":    &websocketCswsh{},
    "websocket/no-cswsh.yaml": &websocketNoCswsh{},
    "websocket/path.yaml":     &websocketWithPath{},
}

type websocketBasic struct{}

// Execute executes a test case and returns an error if one occurred
func (h *websocketBasic) Execute(filePath string) error {
    connHandler := func(conn net.Conn) {
        for {
            msg, op, _ := wsutil.ReadClientData(conn)
            if string(msg) != "hello" {
                return
            }
            _ = wsutil.WriteServerMessage(conn, op, []byte("world"))
        }
    }
    originValidate := func(origin string) bool {
        return true
    }
    ts := testutils.NewWebsocketServer("", connHandler, originValidate)
    defer ts.Close()

    results, err := testutils.RunNucleiTemplateAndGetResults(filePath, strings.ReplaceAll(ts.URL, "http", "ws"), debug)
    if err != nil {
        return err
    }
    if len(results) != 1 {
        return errIncorrectResultsCount(results)
    }
    return nil
}

type websocketCswsh struct{}

// Execute executes a test case and returns an error if one occurred
func (h *websocketCswsh) Execute(filePath string) error {
    connHandler := func(conn net.Conn) {

    }
    originValidate := func(origin string) bool {
        return true
    }
    ts := testutils.NewWebsocketServer("", connHandler, originValidate)
    defer ts.Close()

    results, err := testutils.RunNucleiTemplateAndGetResults(filePath, strings.ReplaceAll(ts.URL, "http", "ws"), debug)
    if err != nil {
        return err
    }
    if len(results) != 1 {
        return errIncorrectResultsCount(results)
    }
    return nil
}

type websocketNoCswsh struct{}

// Execute executes a test case and returns an error if one occurred
func (h *websocketNoCswsh) Execute(filePath string) error {
    connHandler := func(conn net.Conn) {

    }
    originValidate := func(origin string) bool {
        return origin == "https://google.com"
    }
    ts := testutils.NewWebsocketServer("", connHandler, originValidate)
    defer ts.Close()

    results, err := testutils.RunNucleiTemplateAndGetResults(filePath, strings.ReplaceAll(ts.URL, "http", "ws"), debug)
    if err != nil {
        return err
    }
    if len(results) != 0 {
        return errIncorrectResultsCount(results)
    }
    return nil
}

type websocketWithPath struct{}

// Execute executes a test case and returns an error if one occurred
func (h *websocketWithPath) Execute(filePath string) error {
    connHandler := func(conn net.Conn) {

    }
    originValidate := func(origin string) bool {
        return origin == "https://google.com"
    }
    ts := testutils.NewWebsocketServer("/test", connHandler, originValidate)
    defer ts.Close()

    results, err := testutils.RunNucleiTemplateAndGetResults(filePath, strings.ReplaceAll(ts.URL, "http", "ws"), debug)
    if err != nil {
        return err
    }
    if len(results) != 0 {
        return errIncorrectResultsCount(results)
    }
    return nil
}

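The cswsh test cases above hinge on the server's `originValidate` callback: a server that accepts every Origin during the websocket handshake is vulnerable to cross-site websocket hijacking, while one that checks against an allowlist is not. A sketch of such an allowlist check; `originAllowed` is a hypothetical helper for illustration, not part of nuclei's API:

```go
package main

import "fmt"

// originAllowed mirrors the originValidate callbacks above: returning true
// for every Origin (as the cswsh case does) accepts cross-origin handshakes;
// comparing against an allowlist (as the no-cswsh case does) rejects them.
func originAllowed(allowlist []string, origin string) bool {
	for _, allowed := range allowlist {
		if origin == allowed {
			return true
		}
	}
	return false
}

func main() {
	allow := []string{"https://google.com"}
	fmt.Println(originAllowed(allow, "https://google.com"))      // true
	fmt.Println(originAllowed(allow, "https://attacker.example")) // false
}
```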
@ -7,7 +7,7 @@ import (

    "github.com/julienschmidt/httprouter"

    "github.com/projectdiscovery/nuclei/v2/internal/testutils"
    "github.com/projectdiscovery/nuclei/v2/pkg/testutils"
)

var workflowTestcases = map[string]testutils.TestCase{

@ -5,51 +5,51 @@
|
||||
#deny-list:
|
||||
# severity: info, low, medium
|
||||
|
||||
# github contains configuration options for github issue tracker
|
||||
#github:
|
||||
# # base-url (optional) is the self-hosted github application url
|
||||
# GitHub contains configuration options for GitHub issue tracker
|
||||
#GitHub:
|
||||
# # base-url (optional) is the self-hosted GitHub application url
|
||||
# base-url: ""
|
||||
# # username is the username of the github user
|
||||
# # username is the username of the GitHub user
|
||||
# username: ""
|
||||
# # owner is the owner name of the repository for issues.
|
||||
# owner: ""
|
||||
# # token is the token for github account.
|
||||
# # token is the token for GitHub account.
|
||||
# token: ""
|
||||
# # project-name is the name of the repository.
|
||||
# project-name: ""
|
||||
# # issue-label (optional) is the label of the created issue type
|
||||
# issue-label: ""
|
||||
# # severity-as-label (optional) sets the sevetiry as the label of the created issue type
|
||||
# # severity-as-label (optional) sets the severity as the label of the created issue type
|
||||
# severity-as-label: false
|
||||
|
||||
# gitlab contains configuration options for gitlab issue tracker
|
||||
#gitlab:
|
||||
# # base-url (optional) is the self-hosted gitlab application url
|
||||
# GitLab contains configuration options for GitLab issue tracker
|
||||
#GitLab:
|
||||
# # base-url (optional) is the self-hosted GitLab application url
|
||||
# base-url: ""
|
||||
# # username is the username of the gitlab user
|
||||
# # username is the username of the GitLab user
|
||||
# username: ""
|
||||
# # token is the token for gitlab account.
|
||||
# # token is the token for GitLab account.
|
||||
# token: ""
|
||||
# # project-id is the ID of the repository.
|
||||
# project-id: ""
|
||||
# # issue-label (optional) is the label of the created issue type
|
||||
# issue-label: ""
|
||||
# # severity-as-label (optional) sets the sevetiry as the label of the created issue type
|
||||
# # severity-as-label (optional) sets the severity as the label of the created issue type
|
||||
# severity-as-label: false
|
||||
|
||||
# jira contains configuration options for jira issue tracker
#jira:
# Jira contains configuration options for Jira issue tracker
#Jira:
# # cloud (optional) is the boolean which tells if Jira instance is running in the cloud or on-prem version is used
# cloud: true
# # update-existing (optional) is the boolean which tells if the existing, opened issue should be updated or new one should be created
# update-existing: false
# # URL is the jira application url
# # URL is the Jira application URL
# url: ""
# # account-id is the account-id of the jira user or username in case of on-prem Jira
# # account-id is the account-id of the Jira user or username in case of on-prem Jira
# account-id: ""
# # email is the email of the user for jira instance
# # email is the email of the user for Jira instance
# email: ""
# # token is the token for jira instance or password in case of on-prem Jira
# # token is the token for Jira instance or password in case of on-prem Jira
# token: ""
# # project-name is the name of the project.
# project-name: ""

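Likewise, a filled-in Jira block mirroring the commented options above might look like this sketch — all values are placeholders chosen for illustration:

```yaml
# hypothetical reporting config fragment -- placeholder values only
Jira:
  cloud: true
  update-existing: false
  url: "https://example.atlassian.net"
  account-id: "5b10a2844c20165700ede21g"   # or a username for on-prem Jira
  email: "security@example.com"
  token: "api-token-or-password"
  project-name: "SEC"
```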
@@ -9,6 +9,7 @@ import (
"github.com/projectdiscovery/gologger"
"github.com/projectdiscovery/nuclei/v2/internal/runner"
"github.com/projectdiscovery/nuclei/v2/pkg/model/types/severity"
templateTypes "github.com/projectdiscovery/nuclei/v2/pkg/templates/types"
"github.com/projectdiscovery/nuclei/v2/pkg/types"
)

@@ -54,8 +55,10 @@ on extensive configurability, massive extensibility and ease of use.`)

createGroup(flagSet, "templates", "Templates",
flagSet.StringSliceVarP(&options.Templates, "templates", "t", []string{}, "template or template directory paths to include in the scan"),
flagSet.StringSliceVarP(&options.TemplateURLs, "template-url", "tu", []string{}, "URL containing list of templates to run"),
flagSet.BoolVarP(&options.NewTemplates, "new-templates", "nt", false, "run only new templates added in latest nuclei-templates release"),
flagSet.StringSliceVarP(&options.Workflows, "workflows", "w", []string{}, "workflow or workflow directory paths to include in the scan"),
flagSet.StringSliceVarP(&options.WorkflowURLs, "workflow-url", "wu", []string{}, "URL containing list of workflows to run"),
flagSet.BoolVar(&options.Validate, "validate", false, "validate the passed templates to nuclei"),
flagSet.BoolVar(&options.TemplateList, "tl", false, "list all available templates"),
)
@@ -68,7 +71,9 @@ on extensive configurability, massive extensibility and ease of use.`)
flagSet.StringSliceVarP(&options.ExcludedTemplates, "exclude-templates", "et", []string{}, "template or template directory paths to exclude"),
flagSet.VarP(&options.Severities, "severity", "s", fmt.Sprintf("Templates to run based on severity. Possible values: %s", severity.GetSupportedSeverities().String())),
flagSet.VarP(&options.ExcludeSeverities, "exclude-severity", "es", fmt.Sprintf("Templates to exclude based on severity. Possible values: %s", severity.GetSupportedSeverities().String())),
flagSet.NormalizedStringSliceVarP(&options.Author, "author", "a", []string{}, "execute templates that are (co-)created by the specified authors"),
flagSet.VarP(&options.Protocols, "type", "pt", fmt.Sprintf("protocol types to be executed. Possible values: %s", templateTypes.GetSupportedProtocolTypes())),
flagSet.VarP(&options.ExcludeProtocols, "exclude-type", "ept", fmt.Sprintf("protocol types to not be executed. Possible values: %s", templateTypes.GetSupportedProtocolTypes())),
flagSet.NormalizedStringSliceVarP(&options.Authors, "author", "a", []string{}, "execute templates that are (co-)created by the specified authors"),
)

createGroup(flagSet, "output", "Output",
@@ -80,6 +85,7 @@ on extensive configurability, massive extensibility and ease of use.`)
flagSet.BoolVarP(&options.NoMeta, "no-meta", "nm", false, "don't display match metadata"),
flagSet.BoolVarP(&options.NoTimestamp, "no-timestamp", "nts", false, "don't display timestamp metadata in CLI output"),
flagSet.StringVarP(&options.ReportingDB, "report-db", "rdb", "", "local nuclei reporting database (always use this to persist report data)"),
flagSet.BoolVarP(&options.MatcherStatus, "matcher-status", "ms", false, "show optional match failure status"),
flagSet.StringVarP(&options.MarkdownExportDirectory, "markdown-export", "me", "", "directory to export results in markdown format"),
flagSet.StringVarP(&options.SarifExport, "sarif-export", "se", "", "file to export results in SARIF format"),
)
@@ -93,6 +99,9 @@ on extensive configurability, massive extensibility and ease of use.`)
flagSet.BoolVarP(&options.SystemResolvers, "system-resolvers", "sr", false, "use system DNS resolving as error fallback"),
flagSet.BoolVar(&options.OfflineHTTP, "passive", false, "enable passive HTTP response processing mode"),
flagSet.BoolVarP(&options.EnvironmentVariables, "env-vars", "ev", false, "enable environment variables to be used in template"),
flagSet.StringVarP(&options.ClientCertFile, "client-cert", "cc", "", "client certificate file (PEM-encoded) used for authenticating against scanned hosts"),
flagSet.StringVarP(&options.ClientKeyFile, "client-key", "ck", "", "client key file (PEM-encoded) used for authenticating against scanned hosts"),
flagSet.StringVarP(&options.ClientCAFile, "client-ca", "ca", "", "client certificate authority file (PEM-encoded) used for authenticating against scanned hosts"),
)

createGroup(flagSet, "interactsh", "interactsh",
@@ -101,7 +110,7 @@ on extensive configurability, massive extensibility and ease of use.`)
flagSet.IntVar(&options.InteractionsCacheSize, "interactions-cache-size", 5000, "number of requests to keep in the interactions cache"),
flagSet.IntVar(&options.InteractionsEviction, "interactions-eviction", 60, "number of seconds to wait before evicting requests from cache"),
flagSet.IntVar(&options.InteractionsPollDuration, "interactions-poll-duration", 5, "number of seconds to wait before each interaction poll request"),
flagSet.IntVar(&options.InteractionsColldownPeriod, "interactions-cooldown-period", 5, "extra time for interaction polling before exiting"),
flagSet.IntVar(&options.InteractionsCoolDownPeriod, "interactions-cooldown-period", 5, "extra time for interaction polling before exiting"),
flagSet.BoolVarP(&options.NoInteractsh, "no-interactsh", "ni", false, "disable interactsh server for OAST testing, exclude OAST based templates"),
)

@@ -110,6 +119,8 @@ on extensive configurability, massive extensibility and ease of use.`)
flagSet.IntVarP(&options.RateLimitMinute, "rate-limit-minute", "rlm", 0, "maximum number of requests to send per minute"),
flagSet.IntVarP(&options.BulkSize, "bulk-size", "bs", 25, "maximum number of hosts to be analyzed in parallel per template"),
flagSet.IntVarP(&options.TemplateThreads, "concurrency", "c", 25, "maximum number of templates to be executed in parallel"),
flagSet.IntVarP(&options.HeadlessBulkSize, "headless-bulk-size", "hbs", 10, "maximum number of headless hosts to be analyzed in parallel per template"),
flagSet.IntVarP(&options.HeadlessTemplateThreads, "headless-concurrency", "hc", 10, "maximum number of headless templates to be executed in parallel"),
)

createGroup(flagSet, "optimization", "Optimizations",
@@ -123,7 +134,7 @@ on extensive configurability, massive extensibility and ease of use.`)
)

createGroup(flagSet, "headless", "Headless",
flagSet.BoolVar(&options.Headless, "headless", false, "enable templates that require headless browser support"),
flagSet.BoolVar(&options.Headless, "headless", false, "enable templates that require headless browser support (root user on linux will disable sandbox)"),
flagSet.IntVar(&options.PageTimeout, "page-timeout", 20, "seconds to wait for each page in headless mode"),
flagSet.BoolVarP(&options.ShowBrowser, "show-browser", "sb", false, "show the browser on the screen when running templates with headless mode"),
flagSet.BoolVarP(&options.UseInstalledChrome, "system-chrome", "sc", false, "Use local installed chrome browser instead of nuclei installed"),
@@ -133,12 +144,9 @@ on extensive configurability, massive extensibility and ease of use.`)
flagSet.BoolVar(&options.Debug, "debug", false, "show all requests and responses"),
flagSet.BoolVar(&options.DebugRequests, "debug-req", false, "show all sent requests"),
flagSet.BoolVar(&options.DebugResponse, "debug-resp", false, "show all received responses"),

/* TODO why the separation? http://proxy:port vs socks5://proxy:port etc
TODO should auto-set the HTTP_PROXY variable for the process? */
flagSet.StringVarP(&options.ProxyURL, "proxy-url", "proxy", "", "URL of the HTTP proxy server"),
flagSet.StringVar(&options.ProxySocksURL, "proxy-socks-url", "", "URL of the SOCKS proxy server"),
flagSet.NormalizedStringSliceVarP(&options.Proxy, "proxy", "p", []string{}, "List of HTTP(s)/SOCKS5 proxy to use (comma separated or file input)"),
flagSet.StringVarP(&options.TraceLogFile, "trace-log", "tlog", "", "file to write sent requests trace log"),
flagSet.StringVarP(&options.ErrorLogFile, "error-log", "elog", "", "file to write sent requests error log"),
flagSet.BoolVar(&options.Version, "version", false, "show nuclei version"),
flagSet.BoolVarP(&options.Verbose, "verbose", "v", false, "show verbose output"),
flagSet.BoolVar(&options.VerboseVerbose, "vv", false, "display templates loaded for scan"),
@@ -175,10 +183,3 @@ func createGroup(flagSet *goflags.FlagSet, groupName, description string, flags
currentFlag.Group(groupName)
}
}

/*
HacktoberFest update: Below, you can find our ticket recommendations. Tasks with the "good first issue" label are suitable for first time contributors. If you have other ideas, or need help with getting started, join our Discord channel or reach out to @forgedhallpass.

https://github.com/issues?q=is%3Aopen+is%3Aissue+user%3Aprojectdiscovery+label%3AHacktoberfest

*/

65
v2/go.mod
@@ -5,79 +5,79 @@ go 1.17
require (
github.com/Ice3man543/nvd v1.0.8
github.com/Knetic/govaluate v3.0.1-0.20171022003610-9aa49832a739+incompatible
github.com/akrylysov/pogreb v0.10.1 // indirect
github.com/alecthomas/jsonschema v0.0.0-20210818095345-1014919a589c
github.com/alecthomas/jsonschema v0.0.0-20211022214203-8b29eab41725
github.com/andygrunwald/go-jira v1.14.0
github.com/antchfx/htmlquery v1.2.3
github.com/antchfx/htmlquery v1.2.4
github.com/apex/log v1.9.0
github.com/blang/semver v3.5.1+incompatible
github.com/bluele/gcache v0.0.2
github.com/c4milo/unpackit v0.1.0 // indirect
github.com/corpix/uarand v0.1.1
github.com/go-rod/rod v0.101.7
github.com/go-playground/validator/v10 v10.9.0
github.com/go-rod/rod v0.101.8
github.com/gobwas/ws v1.1.0
github.com/google/go-github v17.0.0+incompatible
github.com/gosuri/uilive v0.0.4 // indirect
github.com/gosuri/uiprogress v0.0.1 // indirect
github.com/itchyny/gojq v0.12.4
github.com/itchyny/gojq v0.12.5
github.com/json-iterator/go v1.1.12
github.com/julienschmidt/httprouter v1.3.0
github.com/karlseguin/ccache v2.0.3+incompatible
github.com/karrick/godirwalk v1.16.1
github.com/logrusorgru/aurora v2.0.3+incompatible
github.com/mattn/go-runewidth v0.0.13 // indirect
github.com/miekg/dns v1.1.43
github.com/olekukonko/tablewriter v0.0.5
github.com/owenrumney/go-sarif v1.0.11
github.com/pkg/errors v0.9.1
github.com/projectdiscovery/clistats v0.0.8
github.com/projectdiscovery/fastdialer v0.0.13-0.20210917073912-cad93d88e69e
github.com/projectdiscovery/cryptoutil v0.0.0-20210805184155-b5d2512f9345
github.com/projectdiscovery/fastdialer v0.0.13
github.com/projectdiscovery/filekv v0.0.0-20210915124239-3467ef45dd08
github.com/projectdiscovery/fileutil v0.0.0-20210928100737-cab279c5d4b5
github.com/projectdiscovery/goflags v0.0.8-0.20211007103353-9b9229e8a240
github.com/projectdiscovery/goflags v0.0.8-0.20211028121123-edf02bc05b1a
github.com/projectdiscovery/gologger v1.1.4
github.com/projectdiscovery/hmap v0.0.2-0.20210917080408-0fd7bd286bfa
github.com/projectdiscovery/interactsh v0.0.6
github.com/projectdiscovery/nuclei-updatecheck-api v0.0.0-20210914222811-0a072d262f77
github.com/projectdiscovery/nuclei-updatecheck-api v0.0.0-20211006155443-c0a8d610a4df
github.com/projectdiscovery/rawhttp v0.0.7
github.com/projectdiscovery/retryabledns v1.0.13-0.20210916165024-76c5b76fd59a
github.com/projectdiscovery/retryabledns v1.0.13-0.20211109182249-43d38df59660
github.com/projectdiscovery/retryablehttp-go v1.0.2
github.com/projectdiscovery/stringsutil v0.0.0-20211013053023-e7b2e104d80d
github.com/projectdiscovery/yamldoc-go v1.0.2
github.com/projectdiscovery/stringsutil v0.0.0-20210830151154-f567170afdd9
github.com/projectdiscovery/yamldoc-go v1.0.3-0.20211126104922-00d2c6bb43b6
github.com/remeh/sizedwaitgroup v1.0.0
github.com/rs/xid v1.3.0
github.com/segmentio/ksuid v1.0.4
github.com/shirou/gopsutil/v3 v3.21.7
github.com/shirou/gopsutil/v3 v3.21.9
github.com/spaolacci/murmur3 v1.1.0
github.com/spf13/cast v1.4.1
github.com/stretchr/testify v1.7.0
github.com/syndtr/goleveldb v1.0.0
github.com/tj/go-update v2.2.5-0.20200519121640-62b4b798fd68+incompatible
github.com/valyala/fasttemplate v1.2.1
github.com/xanzy/go-gitlab v0.50.3
github.com/weppos/publicsuffix-go v0.15.1-0.20210928183822-5ee35905bd95
github.com/xanzy/go-gitlab v0.51.1
github.com/ysmood/gson v0.6.4 // indirect
github.com/ysmood/leakless v0.7.0 // indirect
go.uber.org/atomic v1.9.0
go.uber.org/multierr v1.7.0
go.uber.org/ratelimit v0.2.0
golang.org/x/net v0.0.0-20210916014120-12bc252f5db8
golang.org/x/oauth2 v0.0.0-20210817223510-7df4dd6e12ab
golang.org/x/sys v0.0.0-20210915083310-ed5796bab164 // indirect
golang.org/x/net v0.0.0-20211020060615-d418f374d309
golang.org/x/oauth2 v0.0.0-20211005180243-6b3c2da341f1
golang.org/x/text v0.3.7
google.golang.org/appengine v1.6.7 // indirect
gopkg.in/yaml.v2 v2.4.0
moul.io/http2curl v1.0.0
)

require github.com/projectdiscovery/folderutil v0.0.0-20211203091551-e81604e6940e

require (
git.mills.io/prologic/smtpd v0.0.0-20210710122116-a525b76c287a // indirect
github.com/PuerkitoBio/goquery v1.6.0 // indirect
github.com/StackExchange/wmi v1.2.1 // indirect
github.com/akrylysov/pogreb v0.10.1 // indirect
github.com/andres-erbsen/clock v0.0.0-20160526145045-9e14626cd129 // indirect
github.com/andybalholm/cascadia v1.1.0 // indirect
github.com/antchfx/xpath v1.1.6 // indirect
github.com/aymerick/douceur v0.2.0 // indirect
github.com/antchfx/xpath v1.2.0 // indirect
github.com/bits-and-blooms/bitset v1.2.0 // indirect
github.com/bits-and-blooms/bloom/v3 v3.0.1 // indirect
github.com/c4milo/unpackit v0.1.0 // indirect
github.com/cnf/structhash v0.0.0-20201127153200-e1b16c1ebc08 // indirect
github.com/davecgh/go-spew v1.1.1 // indirect
github.com/dimchansky/utfbom v1.1.1 // indirect
@@ -85,13 +85,18 @@ require (
github.com/eggsampler/acme/v3 v3.2.1 // indirect
github.com/fatih/structs v1.1.0 // indirect
github.com/go-ole/go-ole v1.2.5 // indirect
github.com/go-playground/locales v0.14.0 // indirect
github.com/go-playground/universal-translator v0.18.0 // indirect
github.com/gobwas/httphead v0.1.0 // indirect
github.com/gobwas/pool v0.2.1 // indirect
github.com/golang-jwt/jwt v3.2.1+incompatible // indirect
github.com/golang/groupcache v0.0.0-20200121045136-8c9f03a8e57e // indirect
github.com/golang/protobuf v1.5.2 // indirect
github.com/golang/snappy v0.0.4 // indirect
github.com/google/go-querystring v1.0.0 // indirect
github.com/google/uuid v1.3.0 // indirect
github.com/gorilla/css v1.0.0 // indirect
github.com/gosuri/uilive v0.0.4 // indirect
github.com/gosuri/uiprogress v0.0.1 // indirect
github.com/hashicorp/go-cleanhttp v0.5.1 // indirect
github.com/hashicorp/go-retryablehttp v0.6.8 // indirect
github.com/iancoleman/orderedmap v0.0.0-20190318233801-ac98e3ecb4b0 // indirect
@@ -100,20 +105,19 @@ require (
github.com/karlseguin/ccache/v2 v2.0.8 // indirect
github.com/klauspost/compress v1.13.6 // indirect
github.com/klauspost/pgzip v1.2.5 // indirect
github.com/leodido/go-urn v1.2.1 // indirect
github.com/mattn/go-isatty v0.0.13 // indirect
github.com/microcosm-cc/bluemonday v1.0.15 // indirect
github.com/mattn/go-runewidth v0.0.13 // indirect
github.com/modern-go/concurrent v0.0.0-20180306012644-bacd9c7ef1dd // indirect
github.com/modern-go/reflect2 v1.0.2 // indirect
github.com/pmezard/go-difflib v1.0.0 // indirect
github.com/projectdiscovery/blackrock v0.0.0-20210415162320-b38689ae3a2e // indirect
github.com/projectdiscovery/cryptoutil v0.0.0-20210805184155-b5d2512f9345 // indirect
github.com/projectdiscovery/iputil v0.0.0-20210804143329-3a30fcde43f3 // indirect
github.com/projectdiscovery/mapcidr v0.0.8 // indirect
github.com/projectdiscovery/networkpolicy v0.0.1 // indirect
github.com/rivo/uniseg v0.2.0 // indirect
github.com/saintfish/chardet v0.0.0-20120816061221-3af4cd4741ca // indirect
github.com/tklauser/go-sysconf v0.3.7 // indirect
github.com/tklauser/numcpus v0.2.3 // indirect
github.com/tklauser/go-sysconf v0.3.9 // indirect
github.com/tklauser/numcpus v0.3.0 // indirect
github.com/trivago/tgo v1.0.7 // indirect
github.com/ulikunitz/xz v0.5.10 // indirect
github.com/valyala/bytebufferpool v1.0.0 // indirect
@@ -121,7 +125,10 @@ require (
github.com/ysmood/goob v0.3.0 // indirect
github.com/zclconf/go-cty v1.8.4 // indirect
go.etcd.io/bbolt v1.3.6 // indirect
golang.org/x/crypto v0.0.0-20210711020723-a769d52b0f97 // indirect
golang.org/x/sys v0.0.0-20210915083310-ed5796bab164 // indirect
golang.org/x/time v0.0.0-20191024005414-555d28b269f0 // indirect
google.golang.org/appengine v1.6.7 // indirect
google.golang.org/protobuf v1.27.1 // indirect
gopkg.in/corvus-ch/zbase32.v1 v1.0.0 // indirect
gopkg.in/yaml.v3 v3.0.0-20210107192922-496545a6307b // indirect

95
v2/go.sum
95
v2/go.sum
@ -67,8 +67,9 @@ github.com/ajg/form v1.5.1/go.mod h1:uL1WgH+h2mgNtvBq0339dVnzXdBETtL2LeUXaIv25UY
|
||||
github.com/akrylysov/pogreb v0.10.0/go.mod h1:pNs6QmpQ1UlTJKDezuRWmaqkgUE2TuU0YTWyqJZ7+lI=
|
||||
github.com/akrylysov/pogreb v0.10.1 h1:FqlR8VR7uCbJdfUob916tPM+idpKgeESDXOA1K0DK4w=
|
||||
github.com/akrylysov/pogreb v0.10.1/go.mod h1:pNs6QmpQ1UlTJKDezuRWmaqkgUE2TuU0YTWyqJZ7+lI=
|
||||
github.com/alecthomas/jsonschema v0.0.0-20210818095345-1014919a589c h1:oJsq4z4xKgZWWOhrSZuLZ5KyYfRFytddLL1E5+psfIY=
|
||||
github.com/alecthomas/jsonschema v0.0.0-20210818095345-1014919a589c/go.mod h1:/n6+1/DWPltRLWL/VKyUxg6tzsl5kHUCcraimt4vr60=
|
||||
github.com/alecthomas/jsonschema v0.0.0-20211022214203-8b29eab41725 h1:NjwIgLQlD46o79bheVG4SCdRnnOz4XtgUN1WABX5DLA=
|
||||
github.com/alecthomas/jsonschema v0.0.0-20211022214203-8b29eab41725/go.mod h1:/n6+1/DWPltRLWL/VKyUxg6tzsl5kHUCcraimt4vr60=
|
||||
github.com/alecthomas/template v0.0.0-20160405071501-a0175ee3bccc/go.mod h1:LOuyumcjzFXgccqObfd/Ljyb9UuFJ6TxHnclSeseNhc=
|
||||
github.com/alecthomas/template v0.0.0-20190718012654-fb15b899a751/go.mod h1:LOuyumcjzFXgccqObfd/Ljyb9UuFJ6TxHnclSeseNhc=
|
||||
github.com/alecthomas/units v0.0.0-20151022065526-2efee857e7cf/go.mod h1:ybxpYRFXyAe+OPACYpWeL0wqObRcbAqCMya13uyzqw0=
|
||||
@ -79,10 +80,12 @@ github.com/andybalholm/cascadia v1.1.0 h1:BuuO6sSfQNFRu1LppgbD25Hr2vLYW25JvxHs5z
|
||||
github.com/andybalholm/cascadia v1.1.0/go.mod h1:GsXiBklL0woXo1j/WYWtSYYC4ouU9PqHO0sqidkEA4Y=
|
||||
github.com/andygrunwald/go-jira v1.14.0 h1:7GT/3qhar2dGJ0kq8w0d63liNyHOnxZsUZ9Pe4+AKBI=
|
||||
github.com/andygrunwald/go-jira v1.14.0/go.mod h1:KMo2f4DgMZA1C9FdImuLc04x4WQhn5derQpnsuBFgqE=
|
||||
github.com/antchfx/htmlquery v1.2.3 h1:sP3NFDneHx2stfNXCKbhHFo8XgNjCACnU/4AO5gWz6M=
|
||||
github.com/antchfx/htmlquery v1.2.3/go.mod h1:B0ABL+F5irhhMWg54ymEZinzMSi0Kt3I2if0BLYa3V0=
|
||||
github.com/antchfx/xpath v1.1.6 h1:6sVh6hB5T6phw1pFpHRQ+C4bd8sNI+O58flqtg7h0R0=
|
||||
github.com/antchfx/htmlquery v1.2.4 h1:qLteofCMe/KGovBI6SQgmou2QNyedFUW+pE+BpeZ494=
|
||||
github.com/antchfx/htmlquery v1.2.4/go.mod h1:2xO6iu3EVWs7R2JYqBbp8YzG50gj/ofqs5/0VZoDZLc=
|
||||
github.com/antchfx/xpath v1.1.6/go.mod h1:Yee4kTMuNiPYJ7nSNorELQMr1J33uOpXDMByNYhvtNk=
|
||||
github.com/antchfx/xpath v1.2.0 h1:mbwv7co+x0RwgeGAOHdrKy89GvHaGvxxBtPK0uF9Zr8=
|
||||
github.com/antchfx/xpath v1.2.0/go.mod h1:i54GszH55fYfBmoZXapTHN8T8tkcHfRgLyVwwqzXNcs=
|
||||
github.com/apache/thrift v0.12.0/go.mod h1:cp2SuWMxlEZw2r+iP2GNCdIi4C1qmUzdZFSVb+bacwQ=
|
||||
github.com/apache/thrift v0.13.0/go.mod h1:cp2SuWMxlEZw2r+iP2GNCdIi4C1qmUzdZFSVb+bacwQ=
|
||||
github.com/apex/log v1.9.0 h1:FHtw/xuaM8AgmvDDTI9fiwoAL25Sq2cxojnZICUU8l0=
|
||||
@ -96,14 +99,11 @@ github.com/armon/consul-api v0.0.0-20180202201655-eb2c6b5be1b6/go.mod h1:grANhF5
|
||||
github.com/armon/go-metrics v0.0.0-20180917152333-f0300d1749da/go.mod h1:Q73ZrmVTwzkszR9V5SSuryQ31EELlFMUz1kKyl939pY=
|
||||
github.com/armon/go-radix v0.0.0-20180808171621-7fddfc383310/go.mod h1:ufUuZ+zHj4x4TnLV4JWEpy2hxWSpsRywHrMgIH9cCH8=
|
||||
github.com/aryann/difflib v0.0.0-20170710044230-e206f873d14a/go.mod h1:DAHtR1m6lCRdSC2Tm3DSWRPvIPr6xNKyeHdqDQSQT+A=
|
||||
github.com/asaskevich/govalidator v0.0.0-20210307081110-f21760c49a8d/go.mod h1:WaHUgvxTVq04UNunO+XhnAqY/wQc+bxr74GqbsZ/Jqw=
|
||||
github.com/aws/aws-lambda-go v1.13.3/go.mod h1:4UKl9IzQMoD+QF79YdCuzCwp8VbmG4VAQwij/eHl5CU=
|
||||
github.com/aws/aws-sdk-go v1.20.6/go.mod h1:KmX6BPdI08NWTb3/sm4ZGu5ShLoqVDhKgpiN924inxo=
|
||||
github.com/aws/aws-sdk-go v1.27.0/go.mod h1:KmX6BPdI08NWTb3/sm4ZGu5ShLoqVDhKgpiN924inxo=
|
||||
github.com/aws/aws-sdk-go-v2 v0.18.0/go.mod h1:JWVYvqSMppoMJC0x5wdwiImzgXTI9FuZwxzkQq9wy+g=
|
||||
github.com/aybabtme/rgbterm v0.0.0-20170906152045-cc83f3b3ce59/go.mod h1:q/89r3U2H7sSsE2t6Kca0lfwTK8JdoNGS/yzM/4iH5I=
|
||||
github.com/aymerick/douceur v0.2.0 h1:Mv+mAeH1Q+n9Fr+oyamOlAkUNPWPlA8PPGR0QAaYuPk=
|
||||
github.com/aymerick/douceur v0.2.0/go.mod h1:wlT5vV2O3h55X9m7iVYN0TBM0NH/MmbLnd30/FjWUq4=
|
||||
github.com/aymerick/raymond v2.0.3-0.20180322193309-b565731e1464+incompatible/go.mod h1:osfaiScAUVup+UC9Nfq76eWqDhXlp+4UYaA8uhTBO6g=
|
||||
github.com/beorn7/perks v0.0.0-20180321164747-3a771d992973/go.mod h1:Dwedo/Wpr24TaqPxmxbtue+5NUziq4I4S80YR8gNf3Q=
|
||||
github.com/beorn7/perks v1.0.0/go.mod h1:KWe93zE9D1o94FZ5RNwFwVgaQK1VOXiVxmqh+CedLV8=
|
||||
@ -230,16 +230,30 @@ github.com/go-logr/logr v0.4.0/go.mod h1:z6/tIYblkpsD+a4lm/fGIIU9mZ+XfAiaFtq7xTg
|
||||
github.com/go-martini/martini v0.0.0-20170121215854-22fa46961aab/go.mod h1:/P9AEU963A2AYjv4d1V5eVL1CQbEJq6aCNHDDjibzu8=
|
||||
github.com/go-ole/go-ole v1.2.5 h1:t4MGB5xEDZvXI+0rMjjsfBsD7yAgp/s9ZDkL1JndXwY=
|
||||
github.com/go-ole/go-ole v1.2.5/go.mod h1:pprOEPIfldk/42T2oK7lQ4v4JSDwmV0As9GaiUsvbm0=
|
||||
github.com/go-playground/assert/v2 v2.0.1 h1:MsBgLAaY856+nPRTKrp3/OZK38U/wa0CcBYNjji3q3A=
|
||||
github.com/go-playground/assert/v2 v2.0.1/go.mod h1:VDjEfimB/XKnb+ZQfWdccd7VUvScMdVu0Titje2rxJ4=
|
||||
github.com/go-playground/locales v0.14.0 h1:u50s323jtVGugKlcYeyzC0etD1HifMjqmJqb8WugfUU=
|
||||
github.com/go-playground/locales v0.14.0/go.mod h1:sawfccIbzZTqEDETgFXqTho0QybSa7l++s0DH+LDiLs=
|
||||
github.com/go-playground/universal-translator v0.18.0 h1:82dyy6p4OuJq4/CByFNOn/jYrnRPArHwAcmLoJZxyho=
|
||||
github.com/go-playground/universal-translator v0.18.0/go.mod h1:UvRDBj+xPUEGrFYl+lu/H90nyDXpg0fqeB/AQUGNTVA=
|
||||
github.com/go-playground/validator/v10 v10.9.0 h1:NgTtmN58D0m8+UuxtYmGztBJB7VnPgjj221I1QHci2A=
|
||||
github.com/go-playground/validator/v10 v10.9.0/go.mod h1:74x4gJWsvQexRdW8Pn3dXSGrTK4nAUsbPlLADvpJkos=
|
||||
github.com/go-redis/redis v6.15.5+incompatible/go.mod h1:NAIEuMOZ/fxfXJIrKDQDz8wamY7mA7PouImQ2Jvg6kA=
|
||||
github.com/go-rod/rod v0.91.1/go.mod h1:/W4lcZiCALPD603MnJGIvhtywP3R6yRB9EDfFfsHiiI=
|
||||
github.com/go-rod/rod v0.101.7 h1:kbI5CNvcRhf7feybBln4xDutsM0mbsF0ENNZfKcF6WA=
|
||||
github.com/go-rod/rod v0.101.7/go.mod h1:N/zlT53CfSpq74nb6rOR0K8UF0SPUPBmzBnArrms+mY=
|
||||
github.com/go-rod/rod v0.101.8 h1:oV0O97uwjkCVyAP0hD6K6bBE8FUMIjs0dtF7l6kEBsU=
|
||||
github.com/go-rod/rod v0.101.8/go.mod h1:N/zlT53CfSpq74nb6rOR0K8UF0SPUPBmzBnArrms+mY=
|
||||
github.com/go-sql-driver/mysql v1.4.0/go.mod h1:zAC/RDZ24gD3HViQzih4MyKcchzm+sOG5ZlKdlhCg5w=
|
||||
github.com/go-stack/stack v1.8.0/go.mod h1:v0f6uXyyMGvRgIKkXu+yp6POWl0qKG85gN/melR3HDY=
|
||||
github.com/go-task/slim-sprig v0.0.0-20210107165309-348f09dbbbc0/go.mod h1:fyg7847qk6SyHyPtNmDHnmrv/HOrqktSC+C9fM+CJOE=
|
||||
github.com/gobwas/httphead v0.0.0-20180130184737-2c6c146eadee/go.mod h1:L0fX3K22YWvt/FAX9NnzrNzcI4wNYi9Yku4O0LKYflo=
|
||||
github.com/gobwas/httphead v0.1.0 h1:exrUm0f4YX0L7EBwZHuCF4GDp8aJfVeBrlLQrs6NqWU=
|
||||
github.com/gobwas/httphead v0.1.0/go.mod h1:O/RXo79gxV8G+RqlR/otEwx4Q36zl9rqC5u12GKvMCM=
|
||||
github.com/gobwas/pool v0.2.0/go.mod h1:q8bcK0KcYlCgd9e7WYLm9LpyS+YeLd8JVDW6WezmKEw=
|
||||
github.com/gobwas/pool v0.2.1 h1:xfeeEhW7pwmX8nuLVlqbzVc7udMDrwetjEv+TZIz1og=
|
||||
github.com/gobwas/pool v0.2.1/go.mod h1:q8bcK0KcYlCgd9e7WYLm9LpyS+YeLd8JVDW6WezmKEw=
|
||||
github.com/gobwas/ws v1.0.2/go.mod h1:szmBTxLgaFppYjEmNtny/v3w89xOydFnnZMcgRRu/EM=
|
||||
github.com/gobwas/ws v1.1.0 h1:7RFti/xnNkMJnrK7D1yQ/iCIB5OrrY/54/H930kIbHA=
|
||||
github.com/gobwas/ws v1.1.0/go.mod h1:nzvNcVha5eUziGrbxFCo6qFIojQHjJV5cLYIbezhfL0=
|
||||
github.com/gogo/googleapis v0.0.0-20180223154316-0cd9801be74a/go.mod h1:gf4bu3Q80BeJ6H1S1vYPm8/ELATdvryBaNFGgqEef3s=
|
||||
github.com/gogo/googleapis v1.1.0/go.mod h1:gf4bu3Q80BeJ6H1S1vYPm8/ELATdvryBaNFGgqEef3s=
|
||||
github.com/gogo/googleapis v1.4.1/go.mod h1:2lpHqI5OcWCtVElxXnPt+s8oJvMpySlOyM6xDCrzib4=
|
||||
@ -328,8 +342,6 @@ github.com/googleapis/gax-go/v2 v2.0.5/go.mod h1:DWXyrwAJ9X0FpwwEdw+IPEYBICEFu5m
|
||||
github.com/gopherjs/gopherjs v0.0.0-20181017120253-0766667cb4d1 h1:EGx4pi6eqNxGaHF6qqu48+N2wcFQ5qg5FXgOdqsJ5d8=
|
||||
github.com/gopherjs/gopherjs v0.0.0-20181017120253-0766667cb4d1/go.mod h1:wJfORRmW1u3UXTncJ5qlYoELFm8eSnnEO6hX4iZ3EWY=
|
||||
github.com/gorilla/context v1.1.1/go.mod h1:kBGZzfjB9CEq2AlWe17Uuf7NDRt0dE0s8S51q0aT7Yg=
|
||||
github.com/gorilla/css v1.0.0 h1:BQqNyPTi50JCFMTw/b67hByjMVXZRwGha6wxVGkeihY=
|
||||
github.com/gorilla/css v1.0.0/go.mod h1:Dn721qIggHpt4+EFCcTLTU/vk5ySda2ReITrtgBl60c=
|
||||
github.com/gorilla/mux v1.6.2/go.mod h1:1lud6UwP+6orDFRuTfBEV8e9/aOM/c4fVVCaMa2zaAs=
|
||||
github.com/gorilla/mux v1.7.3/go.mod h1:1lud6UwP+6orDFRuTfBEV8e9/aOM/c4fVVCaMa2zaAs=
|
||||
github.com/gorilla/websocket v0.0.0-20170926233335-4201258b820c/go.mod h1:E7qHFY5m1UJ88s3WnNqhKjPHQ0heANvMoAMk2YaljkQ=
|
||||
@ -385,8 +397,9 @@ github.com/iris-contrib/go.uuid v2.0.0+incompatible/go.mod h1:iz2lgM/1UnEf1kP0L/
|
||||
github.com/iris-contrib/i18n v0.0.0-20171121225848-987a633949d0/go.mod h1:pMCz62A0xJL6I+umB2YTlFRwWXaDFA0jy+5HzGiJjqI=
|
||||
github.com/iris-contrib/schema v0.0.1/go.mod h1:urYA3uvUNG1TIIjOSCzHr9/LmbQo8LrOcOqfqxa4hXw=
|
||||
github.com/itchyny/go-flags v1.5.0/go.mod h1:lenkYuCobuxLBAd/HGFE4LRoW8D3B6iXRQfWYJ+MNbA=
|
||||
github.com/itchyny/gojq v0.12.4 h1:8zgOZWMejEWCLjbF/1mWY7hY7QEARm7dtuhC6Bp4R8o=
|
||||
github.com/itchyny/gojq v0.12.4/go.mod h1:EQUSKgW/YaOxmXpAwGiowFDO4i2Rmtk5+9dFyeiymAg=
|
||||
github.com/itchyny/gojq v0.12.5 h1:6SJ1BQ1VAwJAlIvLSIZmqHP/RUEq3qfVWvsRxrqhsD0=
|
||||
github.com/itchyny/gojq v0.12.5/go.mod h1:3e1hZXv+Kwvdp6V9HXpVrvddiHVApi5EDZwS+zLFeiE=
|
||||
github.com/itchyny/timefmt-go v0.1.3 h1:7M3LGVDsqcd0VZH2U+x393obrzZisp7C0uEe921iRkU=
|
||||
github.com/itchyny/timefmt-go v0.1.3/go.mod h1:0osSSCQSASBJMsIZnhAaF1C2fCBTJZXrnj37mG8/c+A=
|
||||
github.com/jasonlvhit/gocron v0.0.1 h1:qTt5qF3b3srDjeOIR4Le1LfeyvoYzJlYpqvG7tJX5YU=
|
||||
@ -451,6 +464,8 @@ github.com/kr/text v0.2.0 h1:5Nx0Ya0ZqY2ygV366QzturHI13Jq95ApcVaJBhpS+AY=
|
||||
github.com/kr/text v0.2.0/go.mod h1:eLer722TekiGuMkidMxC/pM04lWEeraHUUmBw8l2grE=
|
||||
github.com/labstack/echo/v4 v4.1.11/go.mod h1:i541M3Fj6f76NZtHSj7TXnyM8n2gaodfvfxNnFqi74g=
|
||||
github.com/labstack/gommon v0.3.0/go.mod h1:MULnywXg0yavhxWKc+lOruYdAhDwPK9wf0OL7NoOu+k=
|
||||
github.com/leodido/go-urn v1.2.1 h1:BqpAaACuzVSgi/VLzGZIobT2z4v53pjosyNd9Yv6n/w=
|
||||
github.com/leodido/go-urn v1.2.1/go.mod h1:zt4jvISO2HfUBqxjfIshjdMTYS56ZS/qv49ictyFfxY=
|
||||
github.com/lightstep/lightstep-tracer-common/golang/gogo v0.0.0-20190605223551-bc2310a04743/go.mod h1:qklhhLq1aX+mtWk9cPHPzaBjWImj5ULL6C7HFJtXQMM=
|
||||
github.com/lightstep/lightstep-tracer-go v0.18.1/go.mod h1:jlF1pusYV4pidLvZ+XD0UBX0ZE6WURAspgAczcDHrL4=
|
||||
github.com/logrusorgru/aurora v0.0.0-20200102142835-e9ef32dff381/go.mod h1:7rIyQOR62GCctdiQpZ/zOJlFyk6y+94wXzv6RNZgaR4=
|
||||
@ -479,8 +494,6 @@ github.com/mediocregopher/mediocre-go-lib v0.0.0-20181029021733-cb65787f37ed/go.
|
||||
github.com/mediocregopher/radix/v3 v3.3.0/go.mod h1:EmfVyvspXz1uZEyPBMyGK+kjWiKQGvsUt6O3Pj+LDCQ=
|
||||
github.com/mgutz/ansi v0.0.0-20170206155736-9520e82c474b/go.mod h1:01TrycV0kFyexm33Z7vhZRXopbI8J3TDReVlkTgMUxE=
|
||||
github.com/microcosm-cc/bluemonday v1.0.2/go.mod h1:iVP4YcDBq+n/5fb23BhYFvIMq/leAFZyRl6bYmGDlGc=
|
||||
github.com/microcosm-cc/bluemonday v1.0.15 h1:J4uN+qPng9rvkBZBoBb8YGR+ijuklIMpSOZZLjYpbeY=
|
||||
github.com/microcosm-cc/bluemonday v1.0.15/go.mod h1:ZLvAzeakRwrGnzQEvstVzVt3ZpqOF2+sdFr0Om+ce30=
|
||||
github.com/miekg/dns v1.0.14/go.mod h1:W1PPwlIAgtquWBMBEV9nkV9Cazfe8ScdGz/Lj7v3Nrg=
github.com/miekg/dns v1.1.29/go.mod h1:KNUDUusw/aVsxyTYZM1oqvCicbwhgbNgztCETuNZ7xM=
github.com/miekg/dns v1.1.41/go.mod h1:p6aan82bvRIyn+zDIv9xYNUpwa73JcSh9BKwknJysuI=
@@ -573,17 +586,22 @@ github.com/projectdiscovery/cryptoutil v0.0.0-20210805184155-b5d2512f9345 h1:jT6
github.com/projectdiscovery/cryptoutil v0.0.0-20210805184155-b5d2512f9345/go.mod h1:clhQmPnt35ziJW1AhJRKyu8aygXCSoyWj6dtmZBRjjc=
github.com/projectdiscovery/fastdialer v0.0.12/go.mod h1:RkRbxqDCcCFhfNUbkzBIz/ieD4uda2JuUA4WJ+RLee0=
github.com/projectdiscovery/fastdialer v0.0.13-0.20210824195254-0113c1406542/go.mod h1:TuapmLiqtunJOxpM7g0tpTy/TUF/0S+XFyx0B0Wx0DQ=
github.com/projectdiscovery/fastdialer v0.0.13-0.20210917073912-cad93d88e69e h1:xMAFYJgRxopAwKrj7HDwMBKJGCGDbHqopS8f959xges=
github.com/projectdiscovery/fastdialer v0.0.13-0.20210917073912-cad93d88e69e/go.mod h1:O1l6+vAQy1QRo9FqyuyJ57W3CwpIXXg7oGo14Le6ZYQ=
github.com/projectdiscovery/fastdialer v0.0.13 h1:BCe7JsFxRk1kAUQcy4X+9lqEuT7Y6LRSlHXfia03XOo=
github.com/projectdiscovery/fastdialer v0.0.13/go.mod h1:Mex24omi3RxrmhA8Ote7rw+6LWMiaBvbJq8CNp0ksII=
github.com/projectdiscovery/filekv v0.0.0-20210915124239-3467ef45dd08 h1:NwD1R/du1dqrRKN3SJl9kT6tN3K9puuWFXEvYF2ihew=
github.com/projectdiscovery/filekv v0.0.0-20210915124239-3467ef45dd08/go.mod h1:paLCnwV8sL7ppqIwVQodQrk3F6mnWafwTDwRd7ywZwQ=
github.com/projectdiscovery/fileutil v0.0.0-20210804142714-ebba15fa53ca/go.mod h1:U+QCpQnX8o2N2w0VUGyAzjM3yBAe4BKedVElxiImsx0=
github.com/projectdiscovery/fileutil v0.0.0-20210914153648-31f843feaad4/go.mod h1:U+QCpQnX8o2N2w0VUGyAzjM3yBAe4BKedVElxiImsx0=
github.com/projectdiscovery/fileutil v0.0.0-20210926202739-6050d0acf73c/go.mod h1:U+QCpQnX8o2N2w0VUGyAzjM3yBAe4BKedVElxiImsx0=
github.com/projectdiscovery/fileutil v0.0.0-20210928100737-cab279c5d4b5 h1:2dbm7UhrAKnccZttr78CAmG768sSCd+MBn4ayLVDeqA=
github.com/projectdiscovery/fileutil v0.0.0-20210928100737-cab279c5d4b5/go.mod h1:U+QCpQnX8o2N2w0VUGyAzjM3yBAe4BKedVElxiImsx0=
github.com/projectdiscovery/folderutil v0.0.0-20210804143510-68474319fd84 h1:+VqGxv8ywpIHwGGSCOcGn/q5kkuB6F1AZtY42I8VnXc=
github.com/projectdiscovery/folderutil v0.0.0-20210804143510-68474319fd84/go.mod h1:BMqXH4jNGByVdE2iLtKvc/6XStaiZRuCIaKv1vw9PnI=
github.com/projectdiscovery/folderutil v0.0.0-20211203091551-e81604e6940e h1:ozfSeEc5j1f7NCEZAiAskP/KYfBD/TzPmFTIfh+CEwE=
github.com/projectdiscovery/folderutil v0.0.0-20211203091551-e81604e6940e/go.mod h1:BMqXH4jNGByVdE2iLtKvc/6XStaiZRuCIaKv1vw9PnI=
github.com/projectdiscovery/goflags v0.0.7/go.mod h1:Jjwsf4eEBPXDSQI2Y+6fd3dBumJv/J1U0nmpM+hy2YY=
github.com/projectdiscovery/goflags v0.0.8-0.20211007103353-9b9229e8a240 h1:b7zDUSsgN5f4/IlhKF6RVGsp/NkHIuty0o1YjzAMKUs=
github.com/projectdiscovery/goflags v0.0.8-0.20211007103353-9b9229e8a240/go.mod h1:Jjwsf4eEBPXDSQI2Y+6fd3dBumJv/J1U0nmpM+hy2YY=
github.com/projectdiscovery/goflags v0.0.8-0.20211028121123-edf02bc05b1a h1:EzwVm8i4zmzqZX55vrDtyfogwHh8AAZ3cWCJe4fEduk=
github.com/projectdiscovery/goflags v0.0.8-0.20211028121123-edf02bc05b1a/go.mod h1:Jjwsf4eEBPXDSQI2Y+6fd3dBumJv/J1U0nmpM+hy2YY=
github.com/projectdiscovery/gologger v1.0.1/go.mod h1:Ok+axMqK53bWNwDSU1nTNwITLYMXMdZtRc8/y1c7sWE=
github.com/projectdiscovery/gologger v1.1.4 h1:qWxGUq7ukHWT849uGPkagPKF3yBPYAsTtMKunQ8O2VI=
github.com/projectdiscovery/gologger v1.1.4/go.mod h1:Bhb6Bdx2PV1nMaFLoXNBmHIU85iROS9y1tBuv7T5pMY=
@@ -591,7 +609,6 @@ github.com/projectdiscovery/hmap v0.0.1/go.mod h1:VDEfgzkKQdq7iGTKz8Ooul0NuYHQ8q
github.com/projectdiscovery/hmap v0.0.2-0.20210616215655-7b78e7f33d1f/go.mod h1:FH+MS/WNKTXJQtdRn+/Zg5WlKCiMN0Z1QUedUIuM5n8=
github.com/projectdiscovery/hmap v0.0.2-0.20210727180307-d63d35146e97/go.mod h1:FH+MS/WNKTXJQtdRn+/Zg5WlKCiMN0Z1QUedUIuM5n8=
github.com/projectdiscovery/hmap v0.0.2-0.20210825180603-fca7166c158f/go.mod h1:RLM8b1z2HEq74u5AXN1Lbvfq+1BZWpnTQJcwLnMLA54=
github.com/projectdiscovery/hmap v0.0.2-0.20210917073634-bfb0e9c03800/go.mod h1:FH+MS/WNKTXJQtdRn+/Zg5WlKCiMN0Z1QUedUIuM5n8=
github.com/projectdiscovery/hmap v0.0.2-0.20210917080408-0fd7bd286bfa h1:9sZWFUAshIa/ea0RKjGRuuZiS5PzYXAFjTRUnSbezr0=
github.com/projectdiscovery/hmap v0.0.2-0.20210917080408-0fd7bd286bfa/go.mod h1:lV5f/PNPmCCjCN/dR317/chN9s7VG5h/xcbFfXOz8Fo=
github.com/projectdiscovery/interactsh v0.0.4/go.mod h1:PtJrddeBW1/LeOVgTvvnjUl3Hu/17jTkoIi8rXeEODE=
@@ -609,25 +626,26 @@ github.com/projectdiscovery/mapcidr v0.0.8 h1:16U05F2x3o/jSTsxSCY2hCuCs9xOSwVxjo
github.com/projectdiscovery/mapcidr v0.0.8/go.mod h1:7CzdUdjuLVI0s33dQ33lWgjg3vPuLFw2rQzZ0RxkT00=
github.com/projectdiscovery/networkpolicy v0.0.1 h1:RGRuPlxE8WLFF9tdKSjTsYiTIKHNHW20Kl0nGGiRb1I=
github.com/projectdiscovery/networkpolicy v0.0.1/go.mod h1:asvdg5wMy3LPVMGALatebKeOYH5n5fV5RCTv6DbxpIs=
github.com/projectdiscovery/nuclei-updatecheck-api v0.0.0-20210914222811-0a072d262f77 h1:SNtAiRRrJtDJJDroaa/bFXt/Tix2LA6+rHRib0ORlJQ=
github.com/projectdiscovery/nuclei-updatecheck-api v0.0.0-20210914222811-0a072d262f77/go.mod h1:pxWVDgq88t9dWv4+J2AIaWgY+EqOE1AyfHS0Tn23w4M=
github.com/projectdiscovery/nuclei-updatecheck-api v0.0.0-20211006155443-c0a8d610a4df h1:CvTNAUD5JbLMqpMFoGNgfk2gOcN0NC57ICu0+oK84vs=
github.com/projectdiscovery/nuclei-updatecheck-api v0.0.0-20211006155443-c0a8d610a4df/go.mod h1:pxWVDgq88t9dWv4+J2AIaWgY+EqOE1AyfHS0Tn23w4M=
github.com/projectdiscovery/nuclei/v2 v2.5.1/go.mod h1:sU2qcY0MQFS0CqP1BgkR8ZnUyFhqK0BdnY6bvTKNjXY=
github.com/projectdiscovery/rawhttp v0.0.7 h1:5m4peVgjbl7gqDcRYMTVEuX+Xs/nh76ohTkkvufucLg=
github.com/projectdiscovery/rawhttp v0.0.7/go.mod h1:PQERZAhAv7yxI/hR6hdDPgK1WTU56l204BweXrBec+0=
github.com/projectdiscovery/retryabledns v1.0.11/go.mod h1:4sMC8HZyF01HXukRleSQYwz4870bwgb4+hTSXTMrkf4=
github.com/projectdiscovery/retryabledns v1.0.12/go.mod h1:4sMC8HZyF01HXukRleSQYwz4870bwgb4+hTSXTMrkf4=
github.com/projectdiscovery/retryabledns v1.0.13-0.20210916165024-76c5b76fd59a h1:WJQjr9qi/VjWhdNiGyNqcFi0967Gp0W3I769bCpHOJE=
github.com/projectdiscovery/retryabledns v1.0.13-0.20210916165024-76c5b76fd59a/go.mod h1:tXaLDs4n3pRZHwfa8mdXpUWe/AYDNK3HlWDjldhRbjI=
github.com/projectdiscovery/retryabledns v1.0.13-0.20211109182249-43d38df59660 h1:Ooa5htghPkdyfpzy6Y5KLdyv4w8ePZWmfzFSPQlJStQ=
github.com/projectdiscovery/retryabledns v1.0.13-0.20211109182249-43d38df59660/go.mod h1:UfszkO3x+GLKVOpXB7boddJKbwNCr+tMPSkfgCSNhl4=
github.com/projectdiscovery/retryablehttp-go v1.0.1/go.mod h1:SrN6iLZilNG1X4neq1D+SBxoqfAF4nyzvmevkTkWsek=
github.com/projectdiscovery/retryablehttp-go v1.0.2 h1:LV1/KAQU+yeWhNVlvveaYFsjBYRwXlNEq0PvrezMV0U=
github.com/projectdiscovery/retryablehttp-go v1.0.2/go.mod h1:dx//aY9V247qHdsRf0vdWHTBZuBQ2vm6Dq5dagxrDYI=
github.com/projectdiscovery/stringsutil v0.0.0-20210804142656-fd3c28dbaafe/go.mod h1:oTRc18WBv9t6BpaN9XBY+QmG28PUpsyDzRht56Qf49I=
github.com/projectdiscovery/stringsutil v0.0.0-20210823090203-2f5f137e8e1d/go.mod h1:oTRc18WBv9t6BpaN9XBY+QmG28PUpsyDzRht56Qf49I=
github.com/projectdiscovery/stringsutil v0.0.0-20210830151154-f567170afdd9 h1:xbL1/7h0k6HE3RzPdYk9W/8pUxESrGWewTaZdIB5Pes=
github.com/projectdiscovery/stringsutil v0.0.0-20210830151154-f567170afdd9/go.mod h1:oTRc18WBv9t6BpaN9XBY+QmG28PUpsyDzRht56Qf49I=
github.com/projectdiscovery/stringsutil v0.0.0-20211013053023-e7b2e104d80d h1:YBYwsm8MrSp9t7mLehyqGwUKZWB08fG+YRePQRo5iFw=
github.com/projectdiscovery/stringsutil v0.0.0-20211013053023-e7b2e104d80d/go.mod h1:JK4F9ACNPgO+Lbm80khX2q1ABInBMbwIOmbsEE61Sn4=
github.com/projectdiscovery/yamldoc-go v1.0.2 h1:SKb7PHgSOXm27Zci05ba0FxpyQiu6bGEiVMEcjCK1rQ=
github.com/projectdiscovery/yamldoc-go v1.0.2/go.mod h1:7uSxfMXaBmzvw8m5EhOEjB6nhz0rK/H9sUjq1ciZu24=
github.com/projectdiscovery/yamldoc-go v1.0.3-0.20211126104922-00d2c6bb43b6 h1:DvWRQpw7Ib2CRL3ogYm/BWM+X0UGPfz1n9Ix9YKgFM8=
github.com/projectdiscovery/yamldoc-go v1.0.3-0.20211126104922-00d2c6bb43b6/go.mod h1:8OfZj8p/axkUM/TJoS/O9LDjj/S8u17rxRbqluE9CU4=
github.com/prometheus/client_golang v0.9.1/go.mod h1:7SWBe2y4D6OKWSNQJUaRYU/AaXPKyh/dDVn+NZz0KFw=
github.com/prometheus/client_golang v0.9.3-0.20190127221311-3c4408c8b829/go.mod h1:p2iRAGwDERtqlqzRXnrOVns+ignqQo//hLXqYxZYVNs=
github.com/prometheus/client_golang v1.0.0/go.mod h1:db9x61etRT2tGnBNRi70OPL5FsnadC4Ky3P0J6CfImo=
@@ -664,8 +682,6 @@ github.com/russross/blackfriday v1.5.2/go.mod h1:JO/DiYxRf+HjHt06OyowR9PTA263kcR
github.com/russross/blackfriday/v2 v2.0.1/go.mod h1:+Rmxgy9KzJVeS9/2gXHxylqXiyQDYRxCVz55jmeOWTM=
github.com/ryanuber/columnize v0.0.0-20160712163229-9b3edd62028f/go.mod h1:sm1tb6uqfes/u+d4ooFouqFdy9/2g9QGwK3SQygK0Ts=
github.com/ryanuber/columnize v2.1.0+incompatible/go.mod h1:sm1tb6uqfes/u+d4ooFouqFdy9/2g9QGwK3SQygK0Ts=
github.com/saintfish/chardet v0.0.0-20120816061221-3af4cd4741ca h1:NugYot0LIVPxTvN8n+Kvkn6TrbMyxQiuvKdEwFdR9vI=
github.com/saintfish/chardet v0.0.0-20120816061221-3af4cd4741ca/go.mod h1:uugorj2VCxiV1x+LzaIdVa9b4S4qGAcH6cbhh4qVxOU=
github.com/samuel/go-zookeeper v0.0.0-20190923202752-2cc03de413da/go.mod h1:gi+0XIa01GRL2eRQVjQkKGqKF3SF9vZR/HnPullcV2E=
github.com/sclevine/agouti v3.0.0+incompatible/go.mod h1:b4WX9W9L1sfQKXeJf1mUTLZKJ48R1S7H23Ji7oFO5Bw=
github.com/sean-/seed v0.0.0-20170313163322-e2103e2c3529/go.mod h1:DxrIzT+xaE7yg65j358z/aeFdxmN0P9QXhEzd20vsDc=
@@ -674,8 +690,9 @@ github.com/segmentio/ksuid v1.0.4 h1:sBo2BdShXjmcugAMwjugoGUdUV0pcxY5mW4xKRn3v4c
github.com/segmentio/ksuid v1.0.4/go.mod h1:/XUiZBD3kVx5SmUOl55voK5yeAbBNNIed+2O73XgrPE=
github.com/sergi/go-diff v1.0.0/go.mod h1:0CfEIISq7TuYL3j771MWULgwwjU+GofnZX9QAmXWZgo=
github.com/sergi/go-diff v1.1.0/go.mod h1:STckp+ISIX8hZLjrqAeVduY0gWCT9IjLuqbuNXdaHfM=
github.com/shirou/gopsutil/v3 v3.21.7 h1:PnTqQamUjwEDSgn+nBGu0qSDV/CfvyiR/gwTH3i7HTU=
github.com/shirou/gopsutil/v3 v3.21.7/go.mod h1:RGl11Y7XMTQPmHh8F0ayC6haKNBgH4PXMJuTAcMOlz4=
github.com/shirou/gopsutil/v3 v3.21.9 h1:Vn4MUz2uXhqLSiCbGFRc0DILbMVLAY92DSkT8bsYrHg=
github.com/shirou/gopsutil/v3 v3.21.9/go.mod h1:YWp/H8Qs5fVmf17v7JNZzA0mPJ+mS2e9JdiUF9LlKzQ=
github.com/shurcooL/sanitized_anchor_name v1.0.0/go.mod h1:1NzhyTcUVG4SuEtjjoZeVRXNmyL/1OwPU0+IJeTBvfc=
github.com/sirupsen/logrus v1.2.0/go.mod h1:LxeOpSwHxABJmUn/MG1IvRgCAasNZTLOkJPxbbu5VWo=
github.com/sirupsen/logrus v1.4.2/go.mod h1:tLMulIdttU9McNUspp0xgXVQah82FyeX6MwdIuYE2rE=
@@ -726,10 +743,12 @@ github.com/tj/go-kinesis v0.0.0-20171128231115-08b17f58cb1b/go.mod h1:/yhzCV0xPf
github.com/tj/go-spin v1.1.0/go.mod h1:Mg1mzmePZm4dva8Qz60H2lHwmJ2loum4VIrLgVnKwh4=
github.com/tj/go-update v2.2.5-0.20200519121640-62b4b798fd68+incompatible h1:guTq1YxwB8XSILkI9q4IrOmrCOS6Hc1L3hmOhi4Swcs=
github.com/tj/go-update v2.2.5-0.20200519121640-62b4b798fd68+incompatible/go.mod h1:waFwwyiAhGey2e+dNoYQ/iLhIcFqhCW7zL/+vDU1WLo=
github.com/tklauser/go-sysconf v0.3.7 h1:HT7h4+536gjqeq1ZIJPgOl1rg1XFatQGVZWp7Py53eg=
github.com/tklauser/go-sysconf v0.3.7/go.mod h1:JZIdXh4RmBvZDBZ41ld2bGxRV3n4daiiqA3skYhAoQ4=
github.com/tklauser/numcpus v0.2.3 h1:nQ0QYpiritP6ViFhrKYsiv6VVxOpum2Gks5GhnJbS/8=
github.com/tklauser/go-sysconf v0.3.9 h1:JeUVdAOWhhxVcU6Eqr/ATFHgXk/mmiItdKeJPev3vTo=
github.com/tklauser/go-sysconf v0.3.9/go.mod h1:11DU/5sG7UexIrp/O6g35hrWzu0JxlwQ3LSFUzyeuhs=
github.com/tklauser/numcpus v0.2.3/go.mod h1:vpEPS/JC+oZGGQ/My/vJnNsvMDQL6PwOqt8dsCw5j+E=
github.com/tklauser/numcpus v0.3.0 h1:ILuRUQBtssgnxw0XXIjKUC56fgnOrFoQQ/4+DeU2biQ=
github.com/tklauser/numcpus v0.3.0/go.mod h1:yFGUr7TUHQRAhyqBcEg0Ge34zDBAsIvJJcyE6boqnA8=
github.com/tmc/grpc-websocket-proxy v0.0.0-20170815181823-89b8d40f7ca8/go.mod h1:ncp9v5uamzpCO7NfCPTXjqaC+bZgJeR0sMTm6dMHP7U=
github.com/trivago/tgo v1.0.7 h1:uaWH/XIy9aWYWpjm2CU3RpcqZXmX2ysQ9/Go+d9gyrM=
github.com/trivago/tgo v1.0.7/go.mod h1:w4dpD+3tzNIIiIfkWWa85w5/B77tlvdZckQ+6PkFnhc=
@@ -750,10 +769,13 @@ github.com/valyala/fasttemplate v1.2.1/go.mod h1:KHLXt3tVN2HBp8eijSv/kGJopbvo7S+
github.com/valyala/tcplisten v0.0.0-20161114210144-ceec8f93295a/go.mod h1:v3UYOV9WzVtRmSR+PDvWpU/qWl4Wa5LApYYX4ZtKbio=
github.com/vmihailenco/msgpack/v4 v4.3.12/go.mod h1:gborTTJjAo/GWTqqRjrLCn9pgNN+NXzzngzBKDPIqw4=
github.com/vmihailenco/tagparser v0.1.1/go.mod h1:OeAg3pn3UbLjkWt+rN9oFYB6u/cQgqMEUPoW2WPyhdI=
github.com/weppos/publicsuffix-go v0.15.1-0.20210928183822-5ee35905bd95 h1:DyAZOw3JsVd6LJHqhl4MpKQdYQEmat0C6pPPwom39Ow=
github.com/weppos/publicsuffix-go v0.15.1-0.20210928183822-5ee35905bd95/go.mod h1:HYux0V0Zi04bHNwOHy4cXJVz/TQjYonnF6aoYhj+3QE=
github.com/wsxiaoys/terminal v0.0.0-20160513160801-0940f3fc43a0 h1:3UeQBvD0TFrlVjOeLOBz+CPAI8dnbqNSVwUwRrkp7vQ=
github.com/wsxiaoys/terminal v0.0.0-20160513160801-0940f3fc43a0/go.mod h1:IXCdmsXIht47RaVFLEdVnh1t+pgYtTAhQGj73kz+2DM=
github.com/xanzy/go-gitlab v0.50.3 h1:M7ncgNhCN4jaFNyXxarJhCLa9Qi6fdmCxFFhMTQPZiY=
github.com/xanzy/go-gitlab v0.50.3/go.mod h1:Q+hQhV508bDPoBijv7YjK/Lvlb4PhVhJdKqXVQrUoAE=
github.com/xanzy/go-gitlab v0.51.1 h1:wWKLalwx4omxFoHh3PLs9zDgAD4GXDP/uoxwMRCSiWM=
github.com/xanzy/go-gitlab v0.51.1/go.mod h1:Q+hQhV508bDPoBijv7YjK/Lvlb4PhVhJdKqXVQrUoAE=
github.com/xeipuuv/gojsonpointer v0.0.0-20180127040702-4e3ac2762d5f/go.mod h1:N2zxlSyiKSe5eX1tZViRH5QA0qijqEDrYZiPEAiq3wU=
github.com/xeipuuv/gojsonreference v0.0.0-20180127040603-bd5ef7bd5415/go.mod h1:GwrjFmJcFw6At/Gs6z4yjiIwzuJ1/+UwLxMQDVQXShQ=
github.com/xeipuuv/gojsonschema v1.2.0/go.mod h1:anYRn/JVcOK2ZgGU+IjEV4nwlhoK5sQluxsYJ78Id3Y=
@@ -834,6 +856,8 @@ golang.org/x/crypto v0.0.0-20190701094942-4def268fd1a4/go.mod h1:yigFU9vqHzYiE8U
golang.org/x/crypto v0.0.0-20191011191535-87dc89f01550/go.mod h1:yigFU9vqHzYiE8UmvKecakEJjdnWj3jj499lnFckfCI=
golang.org/x/crypto v0.0.0-20200622213623-75b288015ac9/go.mod h1:LzIPMQfyMNhhGPhUkYOs5KpL4U8rLKemX1yGLhDgUto=
golang.org/x/crypto v0.0.0-20201112155050-0c6587e931a9/go.mod h1:LzIPMQfyMNhhGPhUkYOs5KpL4U8rLKemX1yGLhDgUto=
golang.org/x/crypto v0.0.0-20210711020723-a769d52b0f97 h1:/UOmuWzQfxxo9UtlXMwuQU8CMgg1eZXqTRwkSQJWKOI=
golang.org/x/crypto v0.0.0-20210711020723-a769d52b0f97/go.mod h1:GvvjBRRGRdwPK5ydBHafDWAxML/pGHZbMvKqRZ5+Abc=
golang.org/x/exp v0.0.0-20190121172915-509febef88a4/go.mod h1:CJ0aWSM057203Lf6IL+f9T1iT9GByDxfZKAQTCR3kQA=
golang.org/x/exp v0.0.0-20190306152737-a1d7652674e8/go.mod h1:CJ0aWSM057203Lf6IL+f9T1iT9GByDxfZKAQTCR3kQA=
golang.org/x/exp v0.0.0-20190510132918-efd6b22b2522/go.mod h1:ZjyILWgesfNpC6sMxTJOJm9Kp84zZh5NQWvqDGG3Qr8=
@@ -922,16 +946,18 @@ golang.org/x/net v0.0.0-20210521195947-fe42d452be8f/go.mod h1:9nx3DQGgdP8bBQD5qx
golang.org/x/net v0.0.0-20210614182718-04defd469f4e/go.mod h1:9nx3DQGgdP8bBQD5qxJ1jj9UTztislL4KSBs9R2vV5Y=
golang.org/x/net v0.0.0-20210813160813-60bc85c4be6d/go.mod h1:9nx3DQGgdP8bBQD5qxJ1jj9UTztislL4KSBs9R2vV5Y=
golang.org/x/net v0.0.0-20210825183410-e898025ed96a/go.mod h1:9nx3DQGgdP8bBQD5qxJ1jj9UTztislL4KSBs9R2vV5Y=
golang.org/x/net v0.0.0-20210916014120-12bc252f5db8 h1:/6y1LfuqNuQdHAm0jjtPtgRcxIxjVZgm5OTu8/QhZvk=
golang.org/x/net v0.0.0-20210916014120-12bc252f5db8/go.mod h1:9nx3DQGgdP8bBQD5qxJ1jj9UTztislL4KSBs9R2vV5Y=
golang.org/x/net v0.0.0-20211020060615-d418f374d309 h1:A0lJIi+hcTR6aajJH4YqKWwohY4aW9RO7oRMcdv+HKI=
golang.org/x/net v0.0.0-20211020060615-d418f374d309/go.mod h1:9nx3DQGgdP8bBQD5qxJ1jj9UTztislL4KSBs9R2vV5Y=
golang.org/x/oauth2 v0.0.0-20180821212333-d2e6202438be/go.mod h1:N/0e6XlmueqKjAGxoOufVs8QHGRruUQn6yWY3a++T0U=
golang.org/x/oauth2 v0.0.0-20181106182150-f42d05182288/go.mod h1:N/0e6XlmueqKjAGxoOufVs8QHGRruUQn6yWY3a++T0U=
golang.org/x/oauth2 v0.0.0-20190226205417-e64efc72b421/go.mod h1:gOpvHmFTYa4IltrdGE7lF6nIHvwfUNPOp7c8zoXwtLw=
golang.org/x/oauth2 v0.0.0-20190604053449-0f29369cfe45/go.mod h1:gOpvHmFTYa4IltrdGE7lF6nIHvwfUNPOp7c8zoXwtLw=
golang.org/x/oauth2 v0.0.0-20191202225959-858c2ad4c8b6/go.mod h1:gOpvHmFTYa4IltrdGE7lF6nIHvwfUNPOp7c8zoXwtLw=
golang.org/x/oauth2 v0.0.0-20200107190931-bf48bf16ab8d/go.mod h1:gOpvHmFTYa4IltrdGE7lF6nIHvwfUNPOp7c8zoXwtLw=
golang.org/x/oauth2 v0.0.0-20210817223510-7df4dd6e12ab h1:llrcWN/wOwO+6gAyfBzxb5hZ+c3mriU/0+KNgYu6adA=
golang.org/x/oauth2 v0.0.0-20210817223510-7df4dd6e12ab/go.mod h1:KelEdhl1UZF7XfJ4dDtk6s++YSgaE7mD/BuKKDLBl4A=
golang.org/x/oauth2 v0.0.0-20211005180243-6b3c2da341f1 h1:B333XXssMuKQeBwiNODx4TupZy7bf4sxFZnN2ZOcvUE=
golang.org/x/oauth2 v0.0.0-20211005180243-6b3c2da341f1/go.mod h1:KelEdhl1UZF7XfJ4dDtk6s++YSgaE7mD/BuKKDLBl4A=
golang.org/x/sync v0.0.0-20180314180146-1d60e4601c6f/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sync v0.0.0-20181108010431-42b317875d0f/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sync v0.0.0-20181221193216-37e7f081c4d4/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
@@ -995,6 +1021,7 @@ golang.org/x/sys v0.0.0-20200923182605-d9f96fdee20d/go.mod h1:h1NjWce9XRLGQEsW7w
golang.org/x/sys v0.0.0-20200930185726-fdedc70b468f/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20201113233024-12cec1faf1ba/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20201119102817-f84b799fce68/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20201207223542-d4d67f95c62d/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20210112080510-489259a85091/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20210119212857-b64e53b001e4/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20210303074136-134d130e1a04/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
@@ -1004,9 +1031,13 @@ golang.org/x/sys v0.0.0-20210423082822-04245dca01da/go.mod h1:h1NjWce9XRLGQEsW7w
golang.org/x/sys v0.0.0-20210426230700-d19ff857e887/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20210510120138-977fb7262007/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20210601080250-7ecdf8ef093b/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20210615035016-665e8c7367d1/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20210616094352-59db8d763f22/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20210630005230-0f9fa26af87c/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20210806184541-e5e7981a1069/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20210816074244-15123e1e1f71/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20210823070655-63515b42dcdf/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20210831042530-f4d43177bf5e/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20210915083310-ed5796bab164 h1:7ZDGnxgHAMw7thfC5bEos0RDAccZKxioiWBhfIe+tvw=
golang.org/x/sys v0.0.0-20210915083310-ed5796bab164/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/term v0.0.0-20201126162022-7de9c90e9dd1/go.mod h1:bj7SfCRtBDWHUb9snDiAeCFNEtKQo2Wmx5Cou7ajbmo=

@@ -20,6 +20,6 @@ func showBanner() {
	gologger.Print().Msgf("%s\n", banner)
	gologger.Print().Msgf("\t\tprojectdiscovery.io\n\n")

	gologger.Error().Label("WRN").Msgf("Use with caution. You are responsible for your actions.\n")
	gologger.Error().Label("WRN").Msgf("Developers assume no liability and are not responsible for any misuse or damage.\n")
	gologger.Print().Label("WRN").Msgf("Use with caution. You are responsible for your actions.\n")
	gologger.Print().Label("WRN").Msgf("Developers assume no liability and are not responsible for any misuse or damage.\n")
}

@@ -2,12 +2,14 @@ package runner

import (
	"bufio"
	"errors"
	"net/url"
	"os"
	"path/filepath"
	"strings"

	"github.com/pkg/errors"

	"github.com/go-playground/validator/v10"

	"github.com/projectdiscovery/fileutil"
	"github.com/projectdiscovery/gologger"
	"github.com/projectdiscovery/gologger/formatter"
@@ -24,7 +26,6 @@ func ParseOptions(options *types.Options) {

	// Read the inputs and configure the logging
	configureOutput(options)

	// Show the user the banner
	showBanner()

@@ -47,13 +48,6 @@ func ParseOptions(options *types.Options) {
		gologger.Fatal().Msgf("Program exiting: %s\n", err)
	}

	// Auto adjust rate limits when using headless mode if the user
	// hasn't specified any custom limits.
	if options.Headless && options.BulkSize == 25 && options.TemplateThreads == 10 {
		options.BulkSize = 2
		options.TemplateThreads = 2
	}

	// Load the resolvers if user asked for them
	loadResolvers(options)

@@ -73,56 +67,56 @@ func ParseOptions(options *types.Options) {

// hasStdin returns true if we have stdin input
func hasStdin() bool {
	stat, err := os.Stdin.Stat()
	fi, err := os.Stdin.Stat()
	if err != nil {
		return false
	}

	isPipedFromChrDev := (stat.Mode() & os.ModeCharDevice) == 0
	isPipedFromFIFO := (stat.Mode() & os.ModeNamedPipe) != 0

	return isPipedFromChrDev || isPipedFromFIFO
	if fi.Mode()&os.ModeNamedPipe == 0 {
		return false
	}
	return true
}

// validateOptions validates the configuration options passed
func validateOptions(options *types.Options) error {
	validate := validator.New()
	if err := validate.Struct(options); err != nil {
		if _, ok := err.(*validator.InvalidValidationError); ok {
			return err
		}
		errs := []string{}
		for _, err := range err.(validator.ValidationErrors) {
			errs = append(errs, err.Namespace()+": "+err.Tag())
		}
		return errors.Wrap(errors.New(strings.Join(errs, ", ")), "validation failed for these fields")
	}
	if options.Verbose && options.Silent {
		return errors.New("both verbose and silent mode specified")
	}

	if err := validateProxyURL(options.ProxyURL, "invalid http proxy format (It should be http://username:password@host:port)"); err != nil {
	// loading the proxy server list from file or cli and test the connectivity
	if err := loadProxyServers(options); err != nil {
		return err
	}

	if err := validateProxyURL(options.ProxySocksURL, "invalid socks proxy format (It should be socks5://username:password@host:port)"); err != nil {
		return err
	}

	if options.Validate {
		options.Headless = true // required for correct validation of headless templates
		validateTemplatePaths(options.TemplatesDirectory, options.Templates, options.Workflows)
	}

	return nil
}

func validateProxyURL(proxyURL, message string) error {
	if proxyURL != "" && !isValidURL(proxyURL) {
		return errors.New(message)
	// Verify if any of the client certificate options were set since it requires all three to work properly
	if len(options.ClientCertFile) > 0 || len(options.ClientKeyFile) > 0 || len(options.ClientCAFile) > 0 {
		if len(options.ClientCertFile) == 0 || len(options.ClientKeyFile) == 0 || len(options.ClientCAFile) == 0 {
			return errors.New("if a client certification option is provided, then all three must be provided")
		}
		validateCertificatePaths([]string{options.ClientCertFile, options.ClientKeyFile, options.ClientCAFile})
	}

	return nil
}

func isValidURL(urlString string) bool {
	_, err := url.Parse(urlString)
	return err == nil
}

// configureOutput configures the output logging levels to be displayed on the screen
func configureOutput(options *types.Options) {
	// If the user desires verbose output, show verbose output
	if options.Verbose {
	if options.Verbose || options.Validate {
		gologger.DefaultLogger.SetMaxLevel(levels.LevelVerbose)
	}
	if options.Debug {
@@ -164,7 +158,6 @@ func loadResolvers(options *types.Options) {

func validateTemplatePaths(templatesDirectory string, templatePaths, workflowPaths []string) {
	allGivenTemplatePaths := append(templatePaths, workflowPaths...)

	for _, templatePath := range allGivenTemplatePaths {
		if templatesDirectory != templatePath && filepath.IsAbs(templatePath) {
			fileInfo, err := os.Stat(templatePath)
@@ -179,3 +172,14 @@ func validateTemplatePaths(templatesDirectory string, templatePaths, workflowPat
		}
	}
}

func validateCertificatePaths(certificatePaths []string) {
	for _, certificatePath := range certificatePaths {
		if _, err := os.Stat(certificatePath); os.IsNotExist(err) {
			// The provided path to the PEM certificate does not exist for the client authentication. As this is
			// required for successful authentication, log and return an error
			gologger.Fatal().Msgf("The given path (%s) to the certificate does not exist!", certificatePath)
			break
		}
	}
}

@ -1,81 +0,0 @@
|
||||
package runner
|
||||
|
||||
import (
|
||||
"github.com/projectdiscovery/gologger"
|
||||
"github.com/projectdiscovery/nuclei/v2/pkg/templates"
|
||||
"github.com/remeh/sizedwaitgroup"
|
||||
"go.uber.org/atomic"
|
||||
)
|
||||
|
||||
// processSelfContainedTemplates execute a self-contained template.
|
||||
func (r *Runner) processSelfContainedTemplates(template *templates.Template) bool {
|
||||
match, err := template.Executer.Execute("")
|
||||
if err != nil {
|
||||
gologger.Warning().Msgf("[%s] Could not execute step: %s\n", r.colorizer.BrightBlue(template.ID), err)
|
||||
}
|
||||
return match
|
||||
}
|
||||
|
||||
// processTemplateWithList execute a template against the list of user provided targets
|
||||
func (r *Runner) processTemplateWithList(template *templates.Template) bool {
|
||||
results := &atomic.Bool{}
|
||||
wg := sizedwaitgroup.New(r.options.BulkSize)
|
||||
processItem := func(k, _ []byte) error {
|
||||
URL := string(k)
|
||||
|
||||
// Skip if the host has had errors
|
||||
if r.hostErrors != nil && r.hostErrors.Check(URL) {
|
||||
return nil
|
||||
}
|
||||
wg.Add()
|
||||
go func(URL string) {
|
||||
defer wg.Done()
|
||||
|
||||
match, err := template.Executer.Execute(URL)
|
||||
if err != nil {
|
||||
gologger.Warning().Msgf("[%s] Could not execute step: %s\n", r.colorizer.BrightBlue(template.ID), err)
|
||||
}
|
||||
results.CAS(false, match)
|
||||
}(URL)
|
||||
return nil
|
||||
}
|
||||
if r.options.Stream {
|
||||
_ = r.hostMapStream.Scan(processItem)
|
||||
} else {
|
||||
r.hostMap.Scan(processItem)
|
||||
}
|
||||
|
||||
wg.Wait()
|
||||
return results.Load()
|
||||
}
|
||||
|
||||
// processTemplateWithList process a template on the URL list
|
||||
func (r *Runner) processWorkflowWithList(template *templates.Template) bool {
|
||||
results := &atomic.Bool{}
|
||||
wg := sizedwaitgroup.New(r.options.BulkSize)
|
||||
|
||||
processItem := func(k, _ []byte) error {
|
||||
URL := string(k)
|
||||
|
||||
// Skip if the host has had errors
|
||||
if r.hostErrors != nil && r.hostErrors.Check(URL) {
|
||||
return nil
|
||||
}
|
||||
wg.Add()
|
||||
go func(URL string) {
|
||||
defer wg.Done()
|
||||
match := template.CompiledWorkflow.RunWorkflow(URL)
|
||||
results.CAS(false, match)
|
||||
}(URL)
|
||||
return nil
|
||||
}
|
||||
|
||||
if r.options.Stream {
|
||||
_ = r.hostMapStream.Scan(processItem)
|
||||
} else {
|
||||
r.hostMap.Scan(processItem)
|
||||
}
|
||||
|
||||
wg.Wait()
|
||||
return results.Load()
|
||||
}
|
||||
123
v2/internal/runner/proxy.go
Normal file
@@ -0,0 +1,123 @@
package runner

import (
	"bufio"
	"errors"
	"fmt"
	"net"
	"net/url"
	"os"
	"strings"
	"time"

	"github.com/projectdiscovery/fileutil"
	"github.com/projectdiscovery/gologger"
	"github.com/projectdiscovery/nuclei/v2/pkg/types"
)

var proxyURLList []url.URL

// loadProxyServers load list of proxy servers from file or comma separated
func loadProxyServers(options *types.Options) error {
	if len(options.Proxy) == 0 {
		return nil
	}
	for _, p := range options.Proxy {
		if proxyURL, err := validateProxyURL(p); err == nil {
			proxyURLList = append(proxyURLList, proxyURL)
		} else if fileutil.FileExists(p) {
			file, err := os.Open(p)
			if err != nil {
				return fmt.Errorf("could not open proxy file: %w", err)
			}
			defer file.Close()
			scanner := bufio.NewScanner(file)
			for scanner.Scan() {
				proxy := scanner.Text()
				if strings.TrimSpace(proxy) == "" {
					continue
				}
				if proxyURL, err := validateProxyURL(proxy); err != nil {
					return err
				} else {
					proxyURLList = append(proxyURLList, proxyURL)
				}
			}
		} else {
			return fmt.Errorf("invalid proxy file or URL provided for %s", p)
		}
	}
	return processProxyList(options)
}

func processProxyList(options *types.Options) error {
	if len(proxyURLList) == 0 {
		return fmt.Errorf("could not find any valid proxy")
	} else {
		done := make(chan bool)
		exitCounter := make(chan bool)
		counter := 0
		for _, url := range proxyURLList {
			go runProxyConnectivity(url, options, done, exitCounter)
		}
		for {
			select {
			case <-done:
				{
					close(done)
					return nil
				}
			case <-exitCounter:
				{
					if counter += 1; counter == len(proxyURLList) {
						return errors.New("no reachable proxy found")
					}
				}
			}
		}
	}
}

func runProxyConnectivity(proxyURL url.URL, options *types.Options, done chan bool, exitCounter chan bool) {
	if err := testProxyConnection(proxyURL, options.Timeout); err == nil {
		if types.ProxyURL == "" && types.ProxySocksURL == "" {
			assignProxyURL(proxyURL, options)
			done <- true
		}
	}
	exitCounter <- true
}

func testProxyConnection(proxyURL url.URL, timeoutDelay int) error {
	timeout := time.Duration(timeoutDelay) * time.Second
	_, err := net.DialTimeout("tcp", fmt.Sprintf("%s:%s", proxyURL.Hostname(), proxyURL.Port()), timeout)
	if err != nil {
		return err
	}
	return nil
}

func assignProxyURL(proxyURL url.URL, options *types.Options) {
	os.Setenv(types.HTTP_PROXY_ENV, proxyURL.String())
	if proxyURL.Scheme == types.HTTP || proxyURL.Scheme == types.HTTPS {
		types.ProxyURL = proxyURL.String()
		types.ProxySocksURL = ""
		gologger.Verbose().Msgf("Using %s as proxy server", proxyURL.String())
	} else if proxyURL.Scheme == types.SOCKS5 {
		types.ProxyURL = ""
		types.ProxySocksURL = proxyURL.String()
		gologger.Verbose().Msgf("Using %s as socket proxy server", proxyURL.String())
	}
}

func validateProxyURL(proxy string) (url.URL, error) {
	if url, err := url.Parse(proxy); err == nil && isSupportedProtocol(url.Scheme) {
		return *url, nil
	}
	return url.URL{}, errors.New("invalid proxy format (It should be http[s]/socks5://[username:password@]host:port)")
}

// isSupportedProtocol checks given protocols are supported
func isSupportedProtocol(value string) bool {
	return value == types.HTTP || value == types.HTTPS || value == types.SOCKS5
}
@@ -2,7 +2,6 @@ package runner

import (
	"bufio"
	"fmt"
	"os"
	"path/filepath"
	"strings"
@@ -10,27 +9,21 @@ import (

	"github.com/logrusorgru/aurora"
	"github.com/pkg/errors"
	"github.com/remeh/sizedwaitgroup"
	"github.com/rs/xid"
	"go.uber.org/atomic"
	"go.uber.org/ratelimit"
	"gopkg.in/yaml.v2"

	"github.com/projectdiscovery/filekv"
	"github.com/projectdiscovery/fileutil"
	"github.com/projectdiscovery/gologger"
	"github.com/projectdiscovery/hmap/store/hybrid"
	"github.com/projectdiscovery/nuclei/v2/internal/colorizer"
	"github.com/projectdiscovery/nuclei/v2/pkg/catalog"
	"github.com/projectdiscovery/nuclei/v2/pkg/catalog/config"
	"github.com/projectdiscovery/nuclei/v2/pkg/catalog/loader"
	"github.com/projectdiscovery/nuclei/v2/pkg/core"
	"github.com/projectdiscovery/nuclei/v2/pkg/core/inputs/hybrid"
	"github.com/projectdiscovery/nuclei/v2/pkg/model/types/severity"
	"github.com/projectdiscovery/nuclei/v2/pkg/output"
	"github.com/projectdiscovery/nuclei/v2/pkg/parsers"
	"github.com/projectdiscovery/nuclei/v2/pkg/progress"
	"github.com/projectdiscovery/nuclei/v2/pkg/projectfile"
	"github.com/projectdiscovery/nuclei/v2/pkg/protocols"
	"github.com/projectdiscovery/nuclei/v2/pkg/protocols/common/clusterer"
	"github.com/projectdiscovery/nuclei/v2/pkg/protocols/common/hosterrorscache"
	"github.com/projectdiscovery/nuclei/v2/pkg/protocols/common/interactsh"
	"github.com/projectdiscovery/nuclei/v2/pkg/protocols/common/protocolinit"
@@ -42,15 +35,13 @@ import (
	"github.com/projectdiscovery/nuclei/v2/pkg/types"
	"github.com/projectdiscovery/nuclei/v2/pkg/utils"
	"github.com/projectdiscovery/nuclei/v2/pkg/utils/stats"
	yamlwrapper "github.com/projectdiscovery/nuclei/v2/pkg/utils/yaml"
)

// Runner is a client for running the enumeration process.
type Runner struct {
	hostMap         *hybrid.HybridMap
	hostMapStream   *filekv.FileDB
	output          output.Writer
	interactsh      *interactsh.Client
	inputCount      int64
	templatesConfig *config.Config
	options         *types.Options
	projectFile     *projectfile.ProjectFile
@@ -59,6 +50,7 @@ type Runner struct {
	colorizer         aurora.Aurora
	issuesClient      *reporting.Client
	addColor          func(severity.Severity) string
	hmapInputProvider *hybrid.Input
	browser           *engine.Browser
	ratelimiter       ratelimit.Limiter
	hostErrors        *hosterrorscache.Cache
@@ -77,11 +69,16 @@ func New(options *types.Options) (*Runner, error) {
	}
	if options.Validate {
		parsers.ShouldValidate = true
		// Does not update the templates when validate flag is used
		options.NoUpdateTemplates = true
	}
	if err := runner.updateTemplates(); err != nil {
		gologger.Warning().Msgf("Could not update templates: %s\n", err)
	}
	if options.Headless {
		if engine.MustDisableSandbox() {
			gologger.Warning().Msgf("The current platform and privileged user will run the browser without sandbox\n")
		}
		browser, err := engine.New(options)
		if err != nil {
			return nil, err
@@ -116,106 +113,16 @@ func New(options *types.Options) (*Runner, error) {
	if (len(options.Templates) == 0 || !options.NewTemplates || (options.TargetsFilePath == "" && !options.Stdin && len(options.Targets) == 0)) && options.UpdateTemplates {
		os.Exit(0)
	}
	hm, err := hybrid.New(hybrid.DefaultDiskOptions)

	// Initialize the input source
	hmapInput, err := hybrid.New(options)
	if err != nil {
		return nil, errors.Wrap(err, "could not create temporary input file")
	}
	runner.hostMap = hm

	if options.Stream {
		fkvOptions := filekv.DefaultOptions
		if tmpFileName, err := fileutil.GetTempFileName(); err != nil {
			return nil, errors.Wrap(err, "could not create temporary input file")
		} else {
			fkvOptions.Path = tmpFileName
		}
		fkv, err := filekv.Open(fkvOptions)
		if err != nil {
			return nil, errors.Wrap(err, "could not create temporary unsorted input file")
		}
		runner.hostMapStream = fkv
	}

	runner.inputCount = 0
	dupeCount := 0

	// Handle multiple targets
	if len(options.Targets) != 0 {
		for _, target := range options.Targets {
			url := strings.TrimSpace(target)
			if url == "" {
				continue
			}

			if _, ok := runner.hostMap.Get(url); ok {
				dupeCount++
				continue
			}

			runner.inputCount++
			// nolint:errcheck // ignoring error
			runner.hostMap.Set(url, nil)
			if options.Stream {
				_ = runner.hostMapStream.Set([]byte(url), nil)
			}
		}
	}

	// Handle stdin
	if options.Stdin {
		scanner := bufio.NewScanner(os.Stdin)
		for scanner.Scan() {
			url := strings.TrimSpace(scanner.Text())
			if url == "" {
				continue
			}

			if _, ok := runner.hostMap.Get(url); ok {
				dupeCount++
				continue
			}

			runner.inputCount++
			// nolint:errcheck // ignoring error
			runner.hostMap.Set(url, nil)
			if options.Stream {
				_ = runner.hostMapStream.Set([]byte(url), nil)
			}
		}
	}

	// Handle target file
	if options.TargetsFilePath != "" {
		input, inputErr := os.Open(options.TargetsFilePath)
		if inputErr != nil {
			return nil, errors.Wrap(inputErr, "could not open targets file")
		}
		scanner := bufio.NewScanner(input)
		for scanner.Scan() {
			url := strings.TrimSpace(scanner.Text())
			if url == "" {
				continue
			}
			if _, ok := runner.hostMap.Get(url); ok {
				dupeCount++
				continue
			}
			runner.inputCount++
			// nolint:errcheck // ignoring error
			runner.hostMap.Set(url, nil)
			if options.Stream {
				_ = runner.hostMapStream.Set([]byte(url), nil)
			}
		}
		input.Close()
	}

	if dupeCount > 0 {
		gologger.Info().Msgf("Supplied input was automatically deduplicated (%d removed).", dupeCount)
		return nil, errors.Wrap(err, "could not create input provider")
	}
	runner.hmapInputProvider = hmapInput

	// Create the output file if asked
	outputWriter, err := output.NewStandardWriter(!options.NoColor, options.NoMeta, options.NoTimestamp, options.JSON, options.JSONRequests, options.Output, options.TraceLogFile)
	outputWriter, err := output.NewStandardWriter(!options.NoColor, options.NoMeta, options.NoTimestamp, options.JSON, options.JSONRequests, options.MatcherStatus, options.Output, options.TraceLogFile, options.ErrorLogFile)
	if err != nil {
		return nil, errors.Wrap(err, "could not create output file")
	}
@@ -243,25 +150,22 @@ func New(options *types.Options) (*Runner, error) {
		}
	}

	if !options.NoInteractsh {
		interactshClient, err := interactsh.New(&interactsh.Options{
			ServerURL:      options.InteractshURL,
			Authorization:  options.InteractshToken,
			CacheSize:      int64(options.InteractionsCacheSize),
			Eviction:       time.Duration(options.InteractionsEviction) * time.Second,
			ColldownPeriod: time.Duration(options.InteractionsColldownPeriod) * time.Second,
			PollDuration:   time.Duration(options.InteractionsPollDuration) * time.Second,
			Output:         runner.output,
			IssuesClient:   runner.issuesClient,
			Progress:       runner.progress,
			Debug:          runner.options.Debug,
		})
		opts := interactsh.NewDefaultOptions(runner.output, runner.issuesClient, runner.progress)
		opts.Debug = runner.options.Debug
		opts.ServerURL = options.InteractshURL
		opts.Authorization = options.InteractshToken
		opts.CacheSize = int64(options.InteractionsCacheSize)
		opts.Eviction = time.Duration(options.InteractionsEviction) * time.Second
		opts.ColldownPeriod = time.Duration(options.InteractionsCoolDownPeriod) * time.Second
		opts.PollDuration = time.Duration(options.InteractionsPollDuration) * time.Second
		opts.NoInteractsh = runner.options.NoInteractsh

		interactshClient, err := interactsh.New(opts)
		if err != nil {
			gologger.Error().Msgf("Could not create interactsh client: %s", err)
		} else {
			runner.interactsh = interactshClient
		}
	}

	if options.RateLimitMinute > 0 {
		runner.ratelimiter = ratelimit.New(options.RateLimitMinute, ratelimit.Per(60*time.Second))
@@ -282,9 +186,9 @@ func createReportingOptions(options *types.Options) (*reporting.Options, error)
	}

	reportingOptions = &reporting.Options{}
	if parseErr := yaml.NewDecoder(file).Decode(reportingOptions); parseErr != nil {
	if err := yamlwrapper.DecodeAndValidate(file, reportingOptions); err != nil {
		file.Close()
		return nil, errors.Wrap(parseErr, "could not parse reporting config file")
		return nil, errors.Wrap(err, "could not parse reporting config file")
	}
	file.Close()
}
@@ -312,13 +216,10 @@ func (r *Runner) Close() {
	if r.output != nil {
		r.output.Close()
	}
	r.hostMap.Close()
	if r.projectFile != nil {
		r.projectFile.Close()
	}
	if r.options.Stream {
		r.hostMapStream.Close()
	}
	r.hmapInputProvider.Close()
	protocolinit.Close()
}

@@ -335,15 +236,20 @@ func (r *Runner) RunEnumeration() error {
	}
	r.options.Templates = append(r.options.Templates, templatesLoaded...)
	}
	// Exclude ignored file for validation
	if !r.options.Validate {
		ignoreFile := config.ReadIgnoreFile()
		r.options.ExcludeTags = append(r.options.ExcludeTags, ignoreFile.Tags...)
		r.options.ExcludedTemplates = append(r.options.ExcludedTemplates, ignoreFile.Files...)

	}
	var cache *hosterrorscache.Cache
	if r.options.MaxHostError > 0 {
		cache = hosterrorscache.New(r.options.MaxHostError, hosterrorscache.DefaultMaxHostsCount).SetVerbose(r.options.Verbose)
	}
	r.hostErrors = cache

	// Create the executer options which will be used throughout the execution
	// stage by the nuclei engine modules.
	executerOpts := protocols.ExecuterOptions{
		Output:  r.output,
		Options: r.options,
@@ -355,31 +261,18 @@ func (r *Runner) RunEnumeration() error {
		ProjectFile:     r.projectFile,
		Browser:         r.browser,
		HostErrorsCache: cache,
		Colorizer:       r.colorizer,
	}
	engine := core.New(r.options)
	engine.SetExecuterOptions(executerOpts)

	workflowLoader, err := parsers.NewLoader(&executerOpts)
	if err != nil {
		return errors.Wrap(err, "Could not create loader.")
	}

	executerOpts.WorkflowLoader = workflowLoader

	loaderConfig := loader.Config{
		Templates:          r.options.Templates,
		Workflows:          r.options.Workflows,
		ExcludeTemplates:   r.options.ExcludedTemplates,
		Tags:               r.options.Tags,
		ExcludeTags:        r.options.ExcludeTags,
		IncludeTemplates:   r.options.IncludeTemplates,
		Authors:            r.options.Author,
		Severities:         r.options.Severities,
		ExcludeSeverities:  r.options.ExcludeSeverities,
		IncludeTags:        r.options.IncludeTags,
		TemplatesDirectory: r.options.TemplatesDirectory,
		Catalog:            r.catalog,
		ExecutorOptions:    executerOpts,
	}
	store, err := loader.New(&loaderConfig)
	store, err := loader.New(loader.NewConfig(r.options, r.catalog, executerOpts))
	if err != nil {
		return errors.Wrap(err, "could not load templates from config")
	}
@@ -389,7 +282,7 @@ func (r *Runner) RunEnumeration() error {
	if err := store.ValidateTemplates(r.options.Templates, r.options.Workflows); err != nil {
		return err
	}
	if stats.GetValue(parsers.SyntaxErrorStats) == 0 && stats.GetValue(parsers.SyntaxWarningStats) == 0 {
	if stats.GetValue(parsers.SyntaxErrorStats) == 0 && stats.GetValue(parsers.SyntaxWarningStats) == 0 && stats.GetValue(parsers.RuntimeWarningsStats) == 0 {
		gologger.Info().Msgf("All templates validated successfully\n")
	} else {
		return errors.New("encountered errors while performing template validation")
@@ -397,9 +290,82 @@ func (r *Runner) RunEnumeration() error {
		return nil // exit
	}

	r.displayExecutionInfo(store)

	var unclusteredRequests int64
	for _, template := range store.Templates() {
		// workflows will dynamically adjust the totals while running, as
		// it can't be known in advance which requests will be called
		if len(template.Workflows) > 0 {
			continue
		}
		unclusteredRequests += int64(template.TotalRequests) * r.hmapInputProvider.Count()
	}

	if r.options.VerboseVerbose {
		for _, template := range store.Templates() {
			r.logAvailableTemplate(template.Path)
		}
		for _, template := range store.Workflows() {
			r.logAvailableTemplate(template.Path)
		}
	}

	// Cluster the templates first because we want info on how many
	// templates did we cluster for showing to user in CLI
	originalTemplatesCount := len(store.Templates())
	finalTemplates, clusterCount := templates.ClusterTemplates(store.Templates(), engine.ExecuterOptions())
	finalTemplates = append(finalTemplates, store.Workflows()...)

	var totalRequests int64
	for _, t := range finalTemplates {
		if len(t.Workflows) > 0 {
			continue
		}
		totalRequests += int64(t.TotalRequests) * r.hmapInputProvider.Count()
	}
	if totalRequests < unclusteredRequests {
		gologger.Info().Msgf("Templates clustered: %d (Reduced %d HTTP Requests)", clusterCount, unclusteredRequests-totalRequests)
	}
	workflowCount := len(store.Workflows())
	templateCount := originalTemplatesCount + workflowCount

	// 0 matches means no templates were found in directory
	if templateCount == 0 {
		return errors.New("no valid templates were found")
	}

	// tracks global progress and captures stdout/stderr until p.Wait finishes
	r.progress.Init(r.hmapInputProvider.Count(), templateCount, totalRequests)

	results := engine.ExecuteWithOpts(finalTemplates, r.hmapInputProvider, true)

	if r.interactsh != nil {
		matched := r.interactsh.Close()
		if matched {
			results.CAS(false, true)
		}
	}
	r.progress.Stop()

	if r.issuesClient != nil {
		r.issuesClient.Close()
	}
	if !results.Load() {
		gologger.Info().Msgf("No results found. Better luck next time!")
	}
	if r.browser != nil {
		r.browser.Close()
	}
	return nil
}

// displayExecutionInfo displays misc info about the nuclei engine execution
func (r *Runner) displayExecutionInfo(store *loader.Store) {
	// Display stats for any loaded templates' syntax warnings or errors
	stats.Display(parsers.SyntaxWarningStats)
	stats.Display(parsers.SyntaxErrorStats)
	stats.Display(parsers.RuntimeWarningsStats)

	builder := &strings.Builder{}
	if r.templatesConfig != nil && r.templatesConfig.NucleiLatestVersion != "" {
@@ -445,128 +411,6 @@ func (r *Runner) RunEnumeration() error {
	if len(store.Workflows()) > 0 {
		gologger.Info().Msgf("Workflows loaded for scan: %d", len(store.Workflows()))
	}

	// pre-parse all the templates, apply filters
	finalTemplates := []*templates.Template{}

	var unclusteredRequests int64
	for _, template := range store.Templates() {
		// workflows will dynamically adjust the totals while running, as
		// it can't be known in advance which requests will be called
		if len(template.Workflows) > 0 {
			continue
		}
		unclusteredRequests += int64(template.TotalRequests) * r.inputCount
	}

	if r.options.VerboseVerbose {
		for _, template := range store.Templates() {
			r.logAvailableTemplate(template.Path)
		}
		for _, template := range store.Workflows() {
			r.logAvailableTemplate(template.Path)
		}
	}
	templatesMap := make(map[string]*templates.Template)
	for _, v := range store.Templates() {
		templatesMap[v.Path] = v
	}
	originalTemplatesCount := len(store.Templates())
	clusterCount := 0
	clusters := clusterer.Cluster(templatesMap)
	for _, cluster := range clusters {
		if len(cluster) > 1 && !r.options.OfflineHTTP {
			executerOpts := protocols.ExecuterOptions{
				Output:          r.output,
				Options:         r.options,
				Progress:        r.progress,
				Catalog:         r.catalog,
				RateLimiter:     r.ratelimiter,
				IssuesClient:    r.issuesClient,
				Browser:         r.browser,
				ProjectFile:     r.projectFile,
				Interactsh:      r.interactsh,
				HostErrorsCache: cache,
			}
			clusterID := fmt.Sprintf("cluster-%s", xid.New().String())

			finalTemplates = append(finalTemplates, &templates.Template{
				ID:            clusterID,
				RequestsHTTP:  cluster[0].RequestsHTTP,
				Executer:      clusterer.NewExecuter(cluster, &executerOpts),
				TotalRequests: len(cluster[0].RequestsHTTP),
			})
			clusterCount += len(cluster)
		} else {
			finalTemplates = append(finalTemplates, cluster...)
		}
	}

	finalTemplates = append(finalTemplates, store.Workflows()...)

	var totalRequests int64
	for _, t := range finalTemplates {
		if len(t.Workflows) > 0 {
			continue
		}
		totalRequests += int64(t.TotalRequests) * r.inputCount
	}
	if totalRequests < unclusteredRequests {
		gologger.Info().Msgf("Templates clustered: %d (Reduced %d HTTP Requests)", clusterCount, unclusteredRequests-totalRequests)
	}
	workflowCount := len(store.Workflows())
	templateCount := originalTemplatesCount + workflowCount

	// 0 matches means no templates were found in directory
	if templateCount == 0 {
		return errors.New("no valid templates were found")
	}

	/*
		TODO does it make sense to run the logic below if there are no targets specified?
		Can we safely assume the user is just experimenting with the template/workflow filters before running them?
	*/

	results := &atomic.Bool{}
	wgtemplates := sizedwaitgroup.New(r.options.TemplateThreads)

	// tracks global progress and captures stdout/stderr until p.Wait finishes
	r.progress.Init(r.inputCount, templateCount, totalRequests)

	for _, t := range finalTemplates {
		wgtemplates.Add()
		go func(template *templates.Template) {
			defer wgtemplates.Done()

			if template.SelfContained {
				results.CAS(false, r.processSelfContainedTemplates(template))
			} else if len(template.Workflows) > 0 {
				results.CAS(false, r.processWorkflowWithList(template))
			} else {
				results.CAS(false, r.processTemplateWithList(template))
			}
		}(t)
	}
	wgtemplates.Wait()

	if r.interactsh != nil {
		matched := r.interactsh.Close()
		if matched {
			results.CAS(false, true)
		}
	}
	r.progress.Stop()

	if r.issuesClient != nil {
		r.issuesClient.Close()
	}
	if !results.Load() {
		gologger.Info().Msgf("No results found. Better luck next time!")
	}
	if r.browser != nil {
		r.browser.Close()
	}
	return nil
}

// readNewTemplatesFile reads newly added templates from directory if it exists

@@ -54,7 +54,7 @@ func (r *Runner) updateTemplates() error { // TODO this method does more than ju
		return err
	}
	configDir := filepath.Join(home, ".config", "nuclei")
	_ = os.MkdirAll(configDir, os.ModePerm)
	_ = os.MkdirAll(configDir, 0755)

	if err := r.readInternalConfigurationFile(home, configDir); err != nil {
		return errors.Wrap(err, "could not read configuration file")
@@ -242,12 +242,12 @@ func (r *Runner) getLatestReleaseFromGithub(latestTag string) (*github.Repositor
func (r *Runner) downloadReleaseAndUnzip(ctx context.Context, version, downloadURL string) (*templateUpdateResults, error) {
	req, err := http.NewRequestWithContext(ctx, http.MethodGet, downloadURL, nil)
	if err != nil {
		return nil, fmt.Errorf("failed to create HTTP request to %s: %s", downloadURL, err)
		return nil, fmt.Errorf("failed to create HTTP request to %s: %w", downloadURL, err)
	}

	res, err := http.DefaultClient.Do(req)
	if err != nil {
		return nil, fmt.Errorf("failed to download a release file from %s: %s", downloadURL, err)
		return nil, fmt.Errorf("failed to download a release file from %s: %w", downloadURL, err)
	}
	defer res.Body.Close()
	if res.StatusCode != http.StatusOK {
@@ -256,23 +256,23 @@ func (r *Runner) downloadReleaseAndUnzip(ctx context.Context, version, downloadU

	buf, err := ioutil.ReadAll(res.Body)
	if err != nil {
		return nil, fmt.Errorf("failed to create buffer for zip file: %s", err)
		return nil, fmt.Errorf("failed to create buffer for zip file: %w", err)
	}

	reader := bytes.NewReader(buf)
	zipReader, err := zip.NewReader(reader, reader.Size())
	if err != nil {
		return nil, fmt.Errorf("failed to uncompress zip file: %s", err)
		return nil, fmt.Errorf("failed to uncompress zip file: %w", err)
	}

	// Create the template folder if it doesn't exist
	if err := os.MkdirAll(r.templatesConfig.TemplatesDirectory, os.ModePerm); err != nil {
		return nil, fmt.Errorf("failed to create template base folder: %s", err)
	if err := os.MkdirAll(r.templatesConfig.TemplatesDirectory, 0755); err != nil {
		return nil, fmt.Errorf("failed to create template base folder: %w", err)
	}

	results, err := r.compareAndWriteTemplates(zipReader)
	if err != nil {
		return nil, fmt.Errorf("failed to write templates: %s", err)
		return nil, fmt.Errorf("failed to write templates: %w", err)
	}

	if r.options.Verbose {
@@ -291,7 +291,7 @@ func (r *Runner) downloadReleaseAndUnzip(ctx context.Context, version, downloadU
		buffer.WriteString("\n")
	}

	if err := ioutil.WriteFile(additionsFile, buffer.Bytes(), os.ModePerm); err != nil {
	if err := ioutil.WriteFile(additionsFile, buffer.Bytes(), 0644); err != nil {
		return nil, errors.Wrap(err, "could not write new additions file")
	}
	return results, err
@@ -316,60 +316,42 @@ func (r *Runner) compareAndWriteTemplates(zipReader *zip.Reader) (*templateUpdat
	// If the path isn't found in new update after being read from the previous checksum,
	// it is removed. This allows us fine-grained control over the download process
	// as well as solves a long-standing problem with nuclei-template updates.
	checksumFile := filepath.Join(r.templatesConfig.TemplatesDirectory, ".checksum")
	configuredTemplateDirectory := r.templatesConfig.TemplatesDirectory
	checksumFile := filepath.Join(configuredTemplateDirectory, ".checksum")
	templateChecksumsMap, _ := createTemplateChecksumsMap(checksumFile)
	for _, zipTemplateFile := range zipReader.File {
		directory, name := filepath.Split(zipTemplateFile.Name)
		if name == "" {
		templateAbsolutePath, skipFile, err := calculateTemplateAbsolutePath(zipTemplateFile.Name, configuredTemplateDirectory)
		if err != nil {
			return nil, err
		}
		if skipFile {
			continue
		}
		paths := strings.Split(directory, string(os.PathSeparator))
		finalPath := filepath.Join(paths[1:]...)

		if strings.HasPrefix(name, ".") || strings.HasPrefix(finalPath, ".") || strings.EqualFold(name, "README.md") {
			continue
		}
		results.totalCount++
		templateDirectory := filepath.Join(r.templatesConfig.TemplatesDirectory, finalPath)
		if err := os.MkdirAll(templateDirectory, os.ModePerm); err != nil {
			return nil, fmt.Errorf("failed to create template folder %s : %s", templateDirectory, err)
		}

		templatePath := filepath.Join(templateDirectory, name)

		isAddition := false
		if _, statErr := os.Stat(templatePath); os.IsNotExist(statErr) {
		if _, statErr := os.Stat(templateAbsolutePath); os.IsNotExist(statErr) {
			isAddition = true
		}
		templateFile, err := os.OpenFile(templatePath, os.O_TRUNC|os.O_CREATE|os.O_WRONLY, 0777)

		newTemplateChecksum, err := writeUnZippedTemplateFile(templateAbsolutePath, zipTemplateFile)
		if err != nil {
			templateFile.Close()
			return nil, fmt.Errorf("could not create uncompressed file: %s", err)
			return nil, err
		}

		zipTemplateFileReader, err := zipTemplateFile.Open()
		oldTemplateChecksum, checksumOk := templateChecksumsMap[templateAbsolutePath]

		relativeTemplatePath, err := filepath.Rel(configuredTemplateDirectory, templateAbsolutePath)
		if err != nil {
			templateFile.Close()
			return nil, fmt.Errorf("could not open archive to extract file: %s", err)
			return nil, fmt.Errorf("could not calculate relative path for template: %s. %w", templateAbsolutePath, err)
		}
		hasher := md5.New()

		// Save file and also read into hasher for md5
		if _, err := io.Copy(templateFile, io.TeeReader(zipTemplateFileReader, hasher)); err != nil {
			templateFile.Close()
			return nil, fmt.Errorf("could not write template file: %s", err)
		}
		templateFile.Close()

		oldChecksum, checksumOK := templateChecksumsMap[templatePath]

		checksum := hex.EncodeToString(hasher.Sum(nil))
		if isAddition {
			results.additions = append(results.additions, filepath.Join(finalPath, name))
		} else if checksumOK && oldChecksum[0] != checksum {
			results.modifications = append(results.modifications, filepath.Join(finalPath, name))
			results.additions = append(results.additions, relativeTemplatePath)
		} else if checksumOk && oldTemplateChecksum[0] != newTemplateChecksum {
			results.modifications = append(results.modifications, relativeTemplatePath)
		}
		results.checksums[templatePath] = checksum
		results.checksums[templateAbsolutePath] = newTemplateChecksum
		results.totalCount++
	}

	// If we don't find the previous file in the newly downloaded list,
@@ -378,12 +360,63 @@ func (r *Runner) compareAndWriteTemplates(zipReader *zip.Reader) (*templateUpdat
		_, ok := results.checksums[templatePath]
		if !ok && templateChecksums[0] == templateChecksums[1] {
			_ = os.Remove(templatePath)
			results.deletions = append(results.deletions, strings.TrimPrefix(strings.TrimPrefix(templatePath, r.templatesConfig.TemplatesDirectory), string(os.PathSeparator)))
			results.deletions = append(results.deletions, strings.TrimPrefix(strings.TrimPrefix(templatePath, configuredTemplateDirectory), string(os.PathSeparator)))
		}
	}
	return results, nil
}

func writeUnZippedTemplateFile(templateAbsolutePath string, zipTemplateFile *zip.File) (string, error) {
	templateFile, err := os.OpenFile(templateAbsolutePath, os.O_TRUNC|os.O_CREATE|os.O_WRONLY, 0644)
	if err != nil {
		return "", fmt.Errorf("could not create template file: %w", err)
	}

	zipTemplateFileReader, err := zipTemplateFile.Open()
	if err != nil {
		_ = templateFile.Close()
		return "", fmt.Errorf("could not open archive to extract file: %w", err)
	}

	md5Hash := md5.New()

	// Save file and also read into hash.Hash for md5
	if _, err := io.Copy(templateFile, io.TeeReader(zipTemplateFileReader, md5Hash)); err != nil {
		_ = templateFile.Close()
		return "", fmt.Errorf("could not write template file: %w", err)
	}

	if err := templateFile.Close(); err != nil {
		return "", fmt.Errorf("could not close newly created template file: %w", err)
	}

	checksum := hex.EncodeToString(md5Hash.Sum(nil))
	return checksum, nil
}

func calculateTemplateAbsolutePath(zipFilePath, configuredTemplateDirectory string) (string, bool, error) {
	directory, fileName := filepath.Split(zipFilePath)

	if strings.TrimSpace(fileName) == "" || strings.HasPrefix(fileName, ".") || strings.EqualFold(fileName, "README.md") {
		return "", true, nil
	}

	directoryPathChunks := strings.Split(directory, string(os.PathSeparator))
	relativeDirectoryPathWithoutZipRoot := filepath.Join(directoryPathChunks[1:]...)

	if strings.HasPrefix(relativeDirectoryPathWithoutZipRoot, ".") {
		return "", true, nil
	}

	templateDirectory := filepath.Join(configuredTemplateDirectory, relativeDirectoryPathWithoutZipRoot)

	if err := os.MkdirAll(templateDirectory, 0755); err != nil {
		return "", false, fmt.Errorf("failed to create template folder: %s. %w", templateDirectory, err)
	}

	return filepath.Join(templateDirectory, fileName), false, nil
}
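`calculateTemplateAbsolutePath` relies on a property of `filepath.Join`: once the zip's root directory is dropped, any `../` traversal in the remaining relative path collapses to a path beginning with `.`, which the prefix check then rejects. The check can be sketched standalone (a simplified version of the skip logic above, using `/` as the separator and omitting the `MkdirAll` side effect):

```go
package main

import (
	"fmt"
	"path/filepath"
	"strings"
)

// isSafeZipEntry mirrors the skip logic above: after dropping the zip's
// root directory, any relative path that still begins with "." (which is
// what "../" traversal collapses to under filepath.Join) is rejected,
// as are hidden files and README.md.
func isSafeZipEntry(zipFilePath string) bool {
	directory, fileName := filepath.Split(zipFilePath)
	if strings.TrimSpace(fileName) == "" || strings.HasPrefix(fileName, ".") || strings.EqualFold(fileName, "README.md") {
		return false
	}
	chunks := strings.Split(directory, "/")
	relative := filepath.Join(chunks[1:]...)
	return !strings.HasPrefix(relative, ".")
}

func main() {
	fmt.Println(isSafeZipEntry("nuclei-templates/cve/test.yaml"))    // true
	fmt.Println(isSafeZipEntry("nuclei-templates/../cve/test.yaml")) // false
}
```

The negative cases in the test file further down exercise exactly this behavior: every traversal variant is expected to come back with `skipFile == true`.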

// createTemplateChecksumsMap reads the previous checksum file from the disk.
// Creates a map of template paths and their previous and currently calculated checksums as values.
func createTemplateChecksumsMap(checksumsFilePath string) (map[string][2]string, error) {

@ -12,10 +12,11 @@ import (
|
||||
"strings"
|
||||
"testing"
|
||||
|
||||
"github.com/projectdiscovery/gologger"
|
||||
"github.com/projectdiscovery/nuclei/v2/internal/testutils"
|
||||
"github.com/projectdiscovery/nuclei/v2/pkg/catalog/config"
|
||||
"github.com/stretchr/testify/require"
|
||||
|
||||
"github.com/projectdiscovery/gologger"
|
||||
"github.com/projectdiscovery/nuclei/v2/pkg/catalog/config"
|
||||
"github.com/projectdiscovery/nuclei/v2/pkg/testutils"
|
||||
)
|
||||
|
||||
func TestDownloadReleaseAndUnzipAddition(t *testing.T) {
|
||||
@ -25,7 +26,7 @@ func TestDownloadReleaseAndUnzipAddition(t *testing.T) {
|
||||
require.Nil(t, err, "could not create temp directory")
|
||||
defer os.RemoveAll(baseTemplates)
|
||||
|
||||
-	err = ioutil.WriteFile(filepath.Join(baseTemplates, "base.yaml"), []byte("id: test"), 0777)
+	err = ioutil.WriteFile(filepath.Join(baseTemplates, "base.yaml"), []byte("id: test"), os.ModePerm)
 	require.Nil(t, err, "could not write base file")

 	err = zipFromDirectory("base.zip", baseTemplates)
@@ -50,9 +51,9 @@ func TestDownloadReleaseAndUnzipAddition(t *testing.T) {
 	require.Nil(t, err, "could not create temp directory")
 	defer os.RemoveAll(newTempDir)

-	err = ioutil.WriteFile(filepath.Join(newTempDir, "base.yaml"), []byte("id: test"), 0777)
+	err = ioutil.WriteFile(filepath.Join(newTempDir, "base.yaml"), []byte("id: test"), os.ModePerm)
 	require.Nil(t, err, "could not create base file")
-	err = ioutil.WriteFile(filepath.Join(newTempDir, "new.yaml"), []byte("id: test"), 0777)
+	err = ioutil.WriteFile(filepath.Join(newTempDir, "new.yaml"), []byte("id: test"), os.ModePerm)
 	require.Nil(t, err, "could not create new file")

 	err = zipFromDirectory("new.zip", newTempDir)
@@ -77,7 +78,7 @@ func TestDownloadReleaseAndUnzipDeletion(t *testing.T) {
 	require.Nil(t, err, "could not create temp directory")
 	defer os.RemoveAll(baseTemplates)

-	err = ioutil.WriteFile(filepath.Join(baseTemplates, "base.yaml"), []byte("id: test"), 0777)
+	err = ioutil.WriteFile(filepath.Join(baseTemplates, "base.yaml"), []byte("id: test"), os.ModePerm)
 	require.Nil(t, err, "could not write base file")

 	err = zipFromDirectory("base.zip", baseTemplates)
@@ -118,6 +119,43 @@ func TestDownloadReleaseAndUnzipDeletion(t *testing.T) {
 	require.Equal(t, "base.yaml", results.deletions[0], "could not get correct new deletions")
 }

+func TestCalculateTemplateAbsolutePath(t *testing.T) {
+	configuredTemplateDirectory := filepath.Join(os.TempDir(), "templates")
+	defer os.RemoveAll(configuredTemplateDirectory)
+
+	t.Run("positive scenarios", func(t *testing.T) {
+		zipFilePathsExpectedPathsMap := map[string]string{
+			"nuclei-templates/cve/test.yaml":      filepath.Join(configuredTemplateDirectory, "cve/test.yaml"),
+			"nuclei-templates/cve/test/test.yaml": filepath.Join(configuredTemplateDirectory, "cve/test/test.yaml"),
+		}
+
+		for filePathFromZip, expectedTemplateAbsPath := range zipFilePathsExpectedPathsMap {
+			calculatedTemplateAbsPath, skipFile, err := calculateTemplateAbsolutePath(filePathFromZip, configuredTemplateDirectory)
+			require.Nil(t, err)
+			require.Equal(t, expectedTemplateAbsPath, calculatedTemplateAbsPath)
+			require.False(t, skipFile)
+		}
+	})
+
+	t.Run("negative scenarios", func(t *testing.T) {
+		filePathsFromZip := []string{
+			"./../nuclei-templates/../cve/test.yaml",
+			"nuclei-templates/../cve/test.yaml",
+			"nuclei-templates/cve/../test.yaml",
+			"nuclei-templates/././../cve/test.yaml",
+			"nuclei-templates/.././../cve/test.yaml",
+			"nuclei-templates/.././../cve/../test.yaml",
+		}
+
+		for _, filePathFromZip := range filePathsFromZip {
+			calculatedTemplateAbsPath, skipFile, err := calculateTemplateAbsolutePath(filePathFromZip, configuredTemplateDirectory)
+			require.Nil(t, err)
+			require.True(t, skipFile)
+			require.Equal(t, "", calculatedTemplateAbsPath)
+		}
+	})
+}
+
 func zipFromDirectory(zipPath, directory string) error {
 	file, err := os.Create(zipPath)
 	if err != nil {
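The negative scenarios above guard against zip-slip style path traversal: archive entries whose cleaned path escapes the configured template directory are skipped rather than extracted. A minimal, standalone sketch of that check (names like `safeJoin` are illustrative, not the actual helper):

```go
package main

import (
	"fmt"
	"path/filepath"
	"strings"
)

// safeJoin returns the extraction path for a zip entry, or skip=true when
// the cleaned path would escape the base directory (zip-slip traversal).
func safeJoin(baseDir, zipEntry string) (path string, skip bool) {
	// Strip the leading archive folder, e.g. "nuclei-templates/".
	parts := strings.SplitN(zipEntry, "/", 2)
	if len(parts) < 2 {
		return "", true // no sub-path: nothing to extract
	}
	cleaned := filepath.Clean(parts[1])
	if cleaned == ".." || strings.HasPrefix(cleaned, ".."+string(filepath.Separator)) {
		return "", true // path escapes the base directory: skip it
	}
	return filepath.Join(baseDir, cleaned), false
}

func main() {
	fmt.Println(safeJoin("/opt/templates", "nuclei-templates/cve/test.yaml"))
	fmt.Println(safeJoin("/opt/templates", "nuclei-templates/../cve/test.yaml"))
}
```

The key point is that `filepath.Clean` collapses `.` and `..` segments first, so a prefix check on the cleaned result catches all of the traversal variants listed in the test.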
@@ -26,7 +26,7 @@ type Config struct {
 const nucleiConfigFilename = ".templates-config.json"

 // Version is the current version of nuclei
-const Version = `2.5.3`
+const Version = `2.5.4`

 func getConfigDetails() (string, error) {
 	homeDir, err := os.UserHomeDir()
@@ -34,7 +34,7 @@ func getConfigDetails() (string, error) {
 		return "", errors.Wrap(err, "could not get home directory")
 	}
 	configDir := filepath.Join(homeDir, ".config", "nuclei")
-	_ = os.MkdirAll(configDir, os.ModePerm)
+	_ = os.MkdirAll(configDir, 0755)
 	templatesConfigFile := filepath.Join(configDir, nucleiConfigFilename)
 	return templatesConfigFile, nil
 }
@@ -67,7 +67,7 @@ func WriteConfiguration(config *Config) error {
 	if err != nil {
 		return err
 	}
-	file, err := os.OpenFile(templatesConfigFile, os.O_WRONLY|os.O_CREATE|os.O_TRUNC, 0777)
+	file, err := os.OpenFile(templatesConfigFile, os.O_WRONLY|os.O_CREATE|os.O_TRUNC, 0644)
 	if err != nil {
 		return err
 	}
@@ -112,7 +112,7 @@ func getIgnoreFilePath() string {
 	home, err := os.UserHomeDir()
 	if err == nil {
 		configDir := filepath.Join(home, ".config", "nuclei")
-		_ = os.MkdirAll(configDir, os.ModePerm)
+		_ = os.MkdirAll(configDir, 0755)

 		defIgnoreFilePath = filepath.Join(configDir, nucleiIgnoreFile)
 		return defIgnoreFilePath
@@ -7,6 +7,7 @@ import (

 	"github.com/karrick/godirwalk"
 	"github.com/pkg/errors"
+
 	"github.com/projectdiscovery/gologger"
 )

@@ -79,7 +80,7 @@ func (c *Catalog) GetTemplatePath(target string) ([]string, error) {
 }

 // convertPathToAbsolute resolves the paths provided to absolute paths
-// before doing any operations on them regardless of them being blob, folders, files, etc.
+// before doing any operations on them regardless of them being BLOB, folders, files, etc.
 func (c *Catalog) convertPathToAbsolute(t string) (string, error) {
 	if strings.Contains(t, "*") {
 		file := filepath.Base(t)
@@ -5,6 +5,7 @@ import (
 	"strings"

 	"github.com/projectdiscovery/nuclei/v2/pkg/model/types/severity"
+	"github.com/projectdiscovery/nuclei/v2/pkg/templates/types"
 )

 // TagFilter is used to filter nuclei templates for tag based execution
@@ -15,6 +16,8 @@ type TagFilter struct {
 	authors      map[string]struct{}
 	block        map[string]struct{}
 	matchAllows  map[string]struct{}
+	types        map[types.ProtocolType]struct{}
+	excludeTypes map[types.ProtocolType]struct{}
 }

 // ErrExcluded is returned for excluded templates
@@ -25,7 +28,7 @@ var ErrExcluded = errors.New("the template was excluded")
 // unless it is explicitly specified by user using the includeTags (matchAllows field).
 // Matching rule: (tag1 OR tag2...) AND (author1 OR author2...) AND (severity1 OR severity2...) AND (extraTags1 OR extraTags2...)
 // Returns true if the template matches the filter criteria, false otherwise.
-func (tagFilter *TagFilter) Match(templateTags, templateAuthors []string, templateSeverity severity.Severity, extraTags []string) (bool, error) {
+func (tagFilter *TagFilter) Match(templateTags, templateAuthors []string, templateSeverity severity.Severity, extraTags []string, templateType types.ProtocolType) (bool, error) {
 	for _, templateTag := range templateTags {
 		_, blocked := tagFilter.block[templateTag]
 		_, allowed := tagFilter.matchAllows[templateTag]
@@ -51,6 +54,9 @@ func (tagFilter *TagFilter) Match(templateTags, templateAuthors []string, templa
 		return false, nil
 	}

+	if !isTemplateTypeMatch(tagFilter, templateType) {
+		return false, nil
+	}
 	return true, nil
 }

@@ -116,6 +122,27 @@ func isTagMatch(tagFilter *TagFilter, templateTags []string) bool {
 	return false
 }

+func isTemplateTypeMatch(tagFilter *TagFilter, templateType types.ProtocolType) bool {
+	if len(tagFilter.excludeTypes) == 0 && len(tagFilter.types) == 0 {
+		return true
+	}
+	if templateType.String() == "" || templateType == types.InvalidProtocol {
+		return true
+	}
+
+	included := true
+	if len(tagFilter.types) > 0 {
+		_, included = tagFilter.types[templateType]
+	}
+
+	excluded := false
+	if len(tagFilter.excludeTypes) > 0 {
+		_, excluded = tagFilter.excludeTypes[templateType]
+	}
+
+	return included && !excluded
+}
+
 type Config struct {
 	Tags        []string
 	ExcludeTags []string
@@ -123,6 +150,8 @@ type Config struct {
 	Severities        severity.Severities
 	ExcludeSeverities severity.Severities
 	IncludeTags       []string
+	Protocols         types.ProtocolTypes
+	ExcludeProtocols  types.ProtocolTypes
 }

 // New returns a tag filter for nuclei tag based execution
@@ -136,6 +165,8 @@ func New(config *Config) *TagFilter {
 		excludeSeverities: make(map[severity.Severity]struct{}),
 		block:             make(map[string]struct{}),
 		matchAllows:       make(map[string]struct{}),
+		types:             make(map[types.ProtocolType]struct{}),
+		excludeTypes:      make(map[types.ProtocolType]struct{}),
 	}
 	for _, tag := range config.ExcludeTags {
 		for _, val := range splitCommaTrim(tag) {
@@ -177,6 +208,16 @@ func New(config *Config) *TagFilter {
 			delete(filter.block, val)
 		}
 	}
+	for _, tag := range config.Protocols {
+		if _, ok := filter.types[tag]; !ok {
+			filter.types[tag] = struct{}{}
+		}
+	}
+	for _, tag := range config.ExcludeProtocols {
+		if _, ok := filter.excludeTypes[tag]; !ok {
+			filter.excludeTypes[tag] = struct{}{}
+		}
+	}
 	return filter
 }

@@ -189,9 +230,9 @@ func splitCommaTrim(value string) []string {
 	if !strings.Contains(value, ",") {
 		return []string{strings.ToLower(value)}
 	}
-	splitted := strings.Split(value, ",")
-	final := make([]string, len(splitted))
-	for i, value := range splitted {
+	split := strings.Split(value, ",")
+	final := make([]string, len(split))
+	for i, value := range split {
 		final[i] = strings.ToLower(strings.TrimSpace(value))
 	}
 	return final
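The new `isTemplateTypeMatch` applies the same semantics the tag filter already uses elsewhere: empty filters match everything, an include set restricts, and an exclude always wins over an include. A standalone sketch of that decision (string keys here stand in for `types.ProtocolType`):

```go
package main

import "fmt"

// typeMatch is the include/exclude decision in isolation: with no filters
// configured everything matches; an include set restricts matches to its
// members; membership in the exclude set rejects even included types.
func typeMatch(include, exclude map[string]struct{}, templateType string) bool {
	if len(include) == 0 && len(exclude) == 0 {
		return true
	}
	included := true
	if len(include) > 0 {
		_, included = include[templateType]
	}
	_, excluded := exclude[templateType]
	return included && !excluded
}

func main() {
	include := map[string]struct{}{"http": {}}
	exclude := map[string]struct{}{"dns": {}}
	fmt.Println(typeMatch(include, exclude, "http")) // true
	fmt.Println(typeMatch(include, exclude, "dns"))  // false
	fmt.Println(typeMatch(nil, nil, "ssl"))          // true: no filters configured
}
```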
@@ -6,6 +6,7 @@ import (
 	"github.com/stretchr/testify/require"

 	"github.com/projectdiscovery/nuclei/v2/pkg/model/types/severity"
+	"github.com/projectdiscovery/nuclei/v2/pkg/templates/types"
 )

 func TestTagBasedFilter(t *testing.T) {
@@ -15,19 +16,19 @@ func TestTagBasedFilter(t *testing.T) {
 	})

 	t.Run("true", func(t *testing.T) {
-		matched, _ := filter.Match([]string{"jira"}, []string{"pdteam"}, severity.Low, nil)
+		matched, _ := filter.Match([]string{"jira"}, []string{"pdteam"}, severity.Low, nil, types.HTTPProtocol)
 		require.True(t, matched, "could not get correct match")
 	})
 	t.Run("false", func(t *testing.T) {
-		matched, _ := filter.Match([]string{"consul"}, []string{"pdteam"}, severity.Low, nil)
+		matched, _ := filter.Match([]string{"consul"}, []string{"pdteam"}, severity.Low, nil, types.HTTPProtocol)
 		require.False(t, matched, "could not get correct match")
 	})
 	t.Run("match-extra-tags-positive", func(t *testing.T) {
-		matched, _ := filter.Match([]string{"cves", "vuln"}, []string{"pdteam"}, severity.Low, []string{"vuln"})
+		matched, _ := filter.Match([]string{"cves", "vuln"}, []string{"pdteam"}, severity.Low, []string{"vuln"}, types.HTTPProtocol)
 		require.True(t, matched, "could not get correct match")
 	})
 	t.Run("match-extra-tags-negative", func(t *testing.T) {
-		matched, _ := filter.Match([]string{"cves"}, []string{"pdteam"}, severity.Low, []string{"vuln"})
+		matched, _ := filter.Match([]string{"cves"}, []string{"pdteam"}, severity.Low, []string{"vuln"}, types.HTTPProtocol)
 		require.False(t, matched, "could not get correct match")
 	})
 }
@@ -36,7 +37,7 @@ func TestTagBasedFilter(t *testing.T) {
 		filter := New(&Config{
 			ExcludeTags: []string{"dos"},
 		})
-		matched, err := filter.Match([]string{"dos"}, []string{"pdteam"}, severity.Low, nil)
+		matched, err := filter.Match([]string{"dos"}, []string{"pdteam"}, severity.Low, nil, types.HTTPProtocol)
 		require.False(t, matched, "could not get correct match")
 		require.Equal(t, ErrExcluded, err, "could not get correct error")
 	})
@@ -46,7 +47,7 @@ func TestTagBasedFilter(t *testing.T) {
 			ExcludeTags: []string{"dos", "fuzz"},
 			IncludeTags: []string{"fuzz"},
 		})
-		matched, err := filter.Match([]string{"fuzz"}, []string{"pdteam"}, severity.Low, nil)
+		matched, err := filter.Match([]string{"fuzz"}, []string{"pdteam"}, severity.Low, nil, types.HTTPProtocol)
 		require.Nil(t, err, "could not get match")
 		require.True(t, matched, "could not get correct match")
 	})
@@ -55,7 +56,7 @@ func TestTagBasedFilter(t *testing.T) {
 			Tags:        []string{"fuzz"},
 			ExcludeTags: []string{"fuzz"},
 		})
-		matched, err := filter.Match([]string{"fuzz"}, []string{"pdteam"}, severity.Low, nil)
+		matched, err := filter.Match([]string{"fuzz"}, []string{"pdteam"}, severity.Low, nil, types.HTTPProtocol)
 		require.Nil(t, err, "could not get match")
 		require.True(t, matched, "could not get correct match")
 	})
@@ -63,24 +64,24 @@ func TestTagBasedFilter(t *testing.T) {
 		filter := New(&Config{
 			Authors: []string{"pdteam"},
 		})
-		matched, _ := filter.Match([]string{"fuzz"}, []string{"pdteam"}, severity.Low, nil)
+		matched, _ := filter.Match([]string{"fuzz"}, []string{"pdteam"}, severity.Low, nil, types.HTTPProtocol)
 		require.True(t, matched, "could not get correct match")
 	})
 	t.Run("match-severity", func(t *testing.T) {
 		filter := New(&Config{
 			Severities: severity.Severities{severity.High},
 		})
-		matched, _ := filter.Match([]string{"fuzz"}, []string{"pdteam"}, severity.High, nil)
+		matched, _ := filter.Match([]string{"fuzz"}, []string{"pdteam"}, severity.High, nil, types.HTTPProtocol)
 		require.True(t, matched, "could not get correct match")
 	})
 	t.Run("match-exclude-severity", func(t *testing.T) {
 		filter := New(&Config{
 			ExcludeSeverities: severity.Severities{severity.Low},
 		})
-		matched, _ := filter.Match([]string{"fuzz"}, []string{"pdteam"}, severity.High, nil)
+		matched, _ := filter.Match([]string{"fuzz"}, []string{"pdteam"}, severity.High, nil, types.HTTPProtocol)
 		require.True(t, matched, "could not get correct match")

-		matched, _ = filter.Match([]string{"fuzz"}, []string{"pdteam"}, severity.Low, nil)
+		matched, _ = filter.Match([]string{"fuzz"}, []string{"pdteam"}, severity.Low, nil, types.HTTPProtocol)
 		require.False(t, matched, "could not get correct match")
 	})
 	t.Run("match-exclude-with-tags", func(t *testing.T) {
@@ -88,7 +89,7 @@ func TestTagBasedFilter(t *testing.T) {
 			Tags:        []string{"tag"},
 			ExcludeTags: []string{"another"},
 		})
-		matched, _ := filter.Match([]string{"another"}, []string{"pdteam"}, severity.High, nil)
+		matched, _ := filter.Match([]string{"another"}, []string{"pdteam"}, severity.High, nil, types.HTTPProtocol)
 		require.False(t, matched, "could not get correct match")
 	})
 	t.Run("match-conditions", func(t *testing.T) {
@@ -97,16 +98,33 @@ func TestTagBasedFilter(t *testing.T) {
 			Tags:       []string{"jira"},
 			Severities: severity.Severities{severity.High},
 		})
-		matched, _ := filter.Match([]string{"jira", "cve"}, []string{"pdteam", "someOtherUser"}, severity.High, nil)
+		matched, _ := filter.Match([]string{"jira", "cve"}, []string{"pdteam", "someOtherUser"}, severity.High, nil, types.HTTPProtocol)
 		require.True(t, matched, "could not get correct match")

-		matched, _ = filter.Match([]string{"jira"}, []string{"pdteam"}, severity.Low, nil)
+		matched, _ = filter.Match([]string{"jira"}, []string{"pdteam"}, severity.Low, nil, types.HTTPProtocol)
 		require.False(t, matched, "could not get correct match")

-		matched, _ = filter.Match([]string{"jira"}, []string{"random"}, severity.Low, nil)
+		matched, _ = filter.Match([]string{"jira"}, []string{"random"}, severity.Low, nil, types.HTTPProtocol)
 		require.False(t, matched, "could not get correct match")

-		matched, _ = filter.Match([]string{"consul"}, []string{"random"}, severity.Low, nil)
+		matched, _ = filter.Match([]string{"consul"}, []string{"random"}, severity.Low, nil, types.HTTPProtocol)
 		require.False(t, matched, "could not get correct match")
 	})
+	t.Run("match-type", func(t *testing.T) {
+		filter := New(&Config{
+			Protocols: []types.ProtocolType{types.HTTPProtocol},
+		})
+		matched, _ := filter.Match([]string{"fuzz"}, []string{"pdteam"}, severity.High, nil, types.HTTPProtocol)
+		require.True(t, matched, "could not get correct match")
+	})
+	t.Run("match-exclude-type", func(t *testing.T) {
+		filter := New(&Config{
+			ExcludeProtocols: []types.ProtocolType{types.HTTPProtocol},
+		})
+		matched, _ := filter.Match([]string{"fuzz"}, []string{"pdteam"}, severity.High, nil, types.DNSProtocol)
+		require.True(t, matched, "could not get correct match")

+		matched, _ = filter.Match([]string{"fuzz"}, []string{"pdteam"}, severity.Low, nil, types.HTTPProtocol)
+		require.False(t, matched, "could not get correct match")
+	})
 }
@@ -10,17 +10,24 @@ import (
 	"github.com/projectdiscovery/nuclei/v2/pkg/parsers"
 	"github.com/projectdiscovery/nuclei/v2/pkg/protocols"
 	"github.com/projectdiscovery/nuclei/v2/pkg/templates"
+	templateTypes "github.com/projectdiscovery/nuclei/v2/pkg/templates/types"
 	"github.com/projectdiscovery/nuclei/v2/pkg/types"
 	"github.com/projectdiscovery/nuclei/v2/pkg/utils/stats"
 )

 // Config contains the configuration options for the loader
 type Config struct {
 	Templates        []string
+	TemplateURLs     []string
 	Workflows        []string
+	WorkflowURLs     []string
 	ExcludeTemplates []string
 	IncludeTemplates []string

 	Tags              []string
 	ExcludeTags       []string
+	Protocols         templateTypes.ProtocolTypes
+	ExcludeProtocols  templateTypes.ProtocolTypes
 	Authors           []string
 	Severities        severity.Severities
 	ExcludeSeverities severity.Severities
@@ -37,6 +44,7 @@ type Store struct {
 	pathFilter     *filter.PathFilter
 	config         *Config
 	finalTemplates []string
+	finalWorkflows []string

 	templates []*templates.Template
 	workflows []*templates.Template
@@ -44,6 +52,30 @@ type Store struct {
 	preprocessor templates.Preprocessor
 }

+// NewConfig returns a new loader config
+func NewConfig(options *types.Options, catalog *catalog.Catalog, executerOpts protocols.ExecuterOptions) *Config {
+	loaderConfig := Config{
+		Templates:          options.Templates,
+		Workflows:          options.Workflows,
+		TemplateURLs:       options.TemplateURLs,
+		WorkflowURLs:       options.WorkflowURLs,
+		ExcludeTemplates:   options.ExcludedTemplates,
+		Tags:               options.Tags,
+		ExcludeTags:        options.ExcludeTags,
+		IncludeTemplates:   options.IncludeTemplates,
+		Authors:            options.Authors,
+		Severities:         options.Severities,
+		ExcludeSeverities:  options.ExcludeSeverities,
+		IncludeTags:        options.IncludeTags,
+		TemplatesDirectory: options.TemplatesDirectory,
+		Protocols:          options.Protocols,
+		ExcludeProtocols:   options.ExcludeProtocols,
+		Catalog:            catalog,
+		ExecutorOptions:    executerOpts,
+	}
+	return &loaderConfig
+}
+
 // New creates a new template store based on provided configuration
 func New(config *Config) (*Store, error) {
 	// Create a tag filter based on provided configuration
@@ -56,18 +88,32 @@ func New(config *Config) (*Store, error) {
 			Severities:        config.Severities,
 			ExcludeSeverities: config.ExcludeSeverities,
 			IncludeTags:       config.IncludeTags,
+			Protocols:         config.Protocols,
+			ExcludeProtocols:  config.ExcludeProtocols,
 		}),
 		pathFilter: filter.NewPathFilter(&filter.PathFilterConfig{
 			IncludedTemplates: config.IncludeTemplates,
 			ExcludedTemplates: config.ExcludeTemplates,
 		}, config.Catalog),
 		finalTemplates: config.Templates,
+		finalWorkflows: config.Workflows,
 	}

+	urlBasedTemplatesProvided := len(config.TemplateURLs) > 0 || len(config.WorkflowURLs) > 0
+	if urlBasedTemplatesProvided {
+		remoteTemplates, remoteWorkflows, err := getRemoteTemplatesAndWorkflows(config.TemplateURLs, config.WorkflowURLs)
+		if err != nil {
+			return store, err
+		}
+		store.finalTemplates = append(store.finalTemplates, remoteTemplates...)
+		store.finalWorkflows = append(store.finalWorkflows, remoteWorkflows...)
+	}
+
 	// Handle a case with no templates or workflows, where we use base directory
-	if len(config.Templates) == 0 && len(config.Workflows) == 0 {
-		config.Templates = append(config.Templates, config.TemplatesDirectory)
+	if len(store.finalTemplates) == 0 && len(store.finalWorkflows) == 0 && !urlBasedTemplatesProvided {
+		store.finalTemplates = []string{config.TemplatesDirectory}
 	}
-	store.finalTemplates = append(store.finalTemplates, config.Templates...)

 	return store, nil
 }

@@ -90,12 +136,16 @@ func (store *Store) RegisterPreprocessor(preprocessor templates.Preprocessor) {
 // the complete compiled templates for a nuclei execution configuration.
 func (store *Store) Load() {
 	store.templates = store.LoadTemplates(store.finalTemplates)
-	store.workflows = store.LoadWorkflows(store.config.Workflows)
+	store.workflows = store.LoadWorkflows(store.finalWorkflows)
 }

 // ValidateTemplates takes a list of templates and validates them
 // erroring out on discovering any faulty templates.
 func (store *Store) ValidateTemplates(templatesList, workflowsList []string) error {
+	// consider all the templates by default if no templates passed by user
+	if len(templatesList) == 0 {
+		templatesList = store.finalTemplates
+	}
 	templatePaths := store.config.Catalog.GetTemplatesPath(templatesList)
 	workflowPaths := store.config.Catalog.GetTemplatesPath(workflowsList)

@@ -169,6 +219,7 @@ func (store *Store) LoadTemplates(templatesList []string) []*templates.Template
 	if loaded {
 		parsed, err := templates.Parse(templatePath, store.preprocessor, store.config.ExecutorOptions)
 		if err != nil {
+			stats.Increment(parsers.RuntimeWarningsStats)
 			gologger.Warning().Msgf("Could not parse template %s: %s\n", templatePath, err)
 		} else if parsed != nil {
 			loadedTemplates = append(loadedTemplates, parsed)
v2/pkg/catalog/loader/remote_loader.go (new file, 95 lines)
@@ -0,0 +1,95 @@
package loader

import (
	"bufio"
	"fmt"
	"net/http"
	"strings"

	"github.com/pkg/errors"
)

type ContentType string

const (
	Template ContentType = "Template"
	Workflow ContentType = "Workflow"
)

type RemoteContentError struct {
	Content []string
	Type    ContentType
	Error   error
}

func getRemoteTemplatesAndWorkflows(templateURLs []string, workflowURLs []string) ([]string, []string, error) {
	remoteContentErrorChannel := make(chan RemoteContentError)

	for _, templateURL := range templateURLs {
		go getRemoteContent(templateURL, remoteContentErrorChannel, Template)
	}
	for _, workflowURL := range workflowURLs {
		go getRemoteContent(workflowURL, remoteContentErrorChannel, Workflow)
	}

	var remoteTemplateList []string
	var remoteWorkFlowList []string
	var err error
	for i := 0; i < (len(templateURLs) + len(workflowURLs)); i++ {
		remoteContentError := <-remoteContentErrorChannel
		if remoteContentError.Error != nil {
			if err != nil {
				err = errors.New(remoteContentError.Error.Error() + ": " + err.Error())
			} else {
				err = remoteContentError.Error
			}
		} else {
			if remoteContentError.Type == Template {
				remoteTemplateList = append(remoteTemplateList, remoteContentError.Content...)
			} else if remoteContentError.Type == Workflow {
				remoteWorkFlowList = append(remoteWorkFlowList, remoteContentError.Content...)
			}
		}
	}

	return remoteTemplateList, remoteWorkFlowList, err
}

func getRemoteContent(URL string, w chan<- RemoteContentError, contentType ContentType) {
	response, err := http.Get(URL)
	if err != nil {
		w <- RemoteContentError{
			Error: err,
		}
		return
	}
	defer response.Body.Close()
	if response.StatusCode < 200 || response.StatusCode > 299 {
		w <- RemoteContentError{
			Error: fmt.Errorf("get \"%s\": unexpected status %d", URL, response.StatusCode),
		}
		return
	}

	scanner := bufio.NewScanner(response.Body)
	var templateList []string
	for scanner.Scan() {
		text := strings.TrimSpace(scanner.Text())
		if text == "" {
			continue
		}
		templateList = append(templateList, text)
	}

	if err := scanner.Err(); err != nil {
		w <- RemoteContentError{
			Error: errors.Wrapf(err, "get \"%s\"", URL),
		}
		return
	}

	w <- RemoteContentError{
		Content: templateList,
		Type:    contentType,
	}
}
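`getRemoteTemplatesAndWorkflows` uses a classic fan-in: one goroutine per URL, all reporting into a single unbuffered channel, with the caller draining exactly as many results as goroutines were started so nothing leaks. The same shape in isolation, with the fetch function injected so the sketch runs without network access (`fetchAll` and `result` are illustrative names):

```go
package main

import (
	"fmt"
	"strings"
)

type result struct {
	lines []string
	err   error
}

// fetchAll starts one goroutine per URL, fans the results into a single
// channel, and drains exactly len(urls) messages — successes are merged,
// the first error encountered is reported.
func fetchAll(urls []string, fetch func(string) ([]string, error)) ([]string, error) {
	results := make(chan result)
	for _, url := range urls {
		go func(u string) {
			lines, err := fetch(u)
			results <- result{lines: lines, err: err}
		}(url)
	}

	var all []string
	var firstErr error
	for range urls {
		r := <-results
		if r.err != nil && firstErr == nil {
			firstErr = r.err
		}
		all = append(all, r.lines...)
	}
	return all, firstErr
}

func main() {
	fake := func(string) ([]string, error) {
		return strings.Fields("cves/a.yaml cves/b.yaml"), nil
	}
	lines, err := fetchAll([]string{"u1", "u2"}, fake)
	fmt.Println(len(lines), err) // 4 <nil>
}
```

Draining the channel exactly `len(urls)` times is what makes the function safe: every goroutine gets to send, even when some of them fail.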
v2/pkg/core/engine.go (new file, 59 lines)
@@ -0,0 +1,59 @@
package core

import (
	"github.com/projectdiscovery/nuclei/v2/pkg/protocols"
	"github.com/projectdiscovery/nuclei/v2/pkg/types"
)

// Engine is an executer for running Nuclei Templates/Workflows.
//
// The engine contains multiple thread pools which allow using different
// concurrency values per protocol executed.
//
// The engine does most of the heavy lifting of execution, from clustering
// templates to their final execution by the work pool.
type Engine struct {
	workPool     *WorkPool
	options      *types.Options
	executerOpts protocols.ExecuterOptions
}

// InputProvider is an input providing interface for the nuclei execution
// engine.
//
// An example InputProvider implementation is provided in form of hybrid
// input provider in pkg/core/inputs/hybrid/hmap.go
type InputProvider interface {
	// Count returns the number of items for input provider
	Count() int64
	// Scan iterates the input and each found item is passed to the
	// callback consumer.
	Scan(callback func(value string))
}

// New returns a new Engine instance
func New(options *types.Options) *Engine {
	workPool := NewWorkPool(WorkPoolConfig{
		InputConcurrency:         options.BulkSize,
		TypeConcurrency:          options.TemplateThreads,
		HeadlessInputConcurrency: options.HeadlessBulkSize,
		HeadlessTypeConcurrency:  options.HeadlessTemplateThreads,
	})
	engine := &Engine{
		options:  options,
		workPool: workPool,
	}
	return engine
}

// SetExecuterOptions sets the executer options for the engine. This is required
// before using the engine to perform any execution.
func (e *Engine) SetExecuterOptions(options protocols.ExecuterOptions) {
	e.executerOpts = options
}

// ExecuterOptions returns protocols.ExecuterOptions for nuclei engine.
func (e *Engine) ExecuterOptions() protocols.ExecuterOptions {
	return e.executerOpts
}

v2/pkg/core/engine_test.go (new file, 1 line)
@@ -0,0 +1 @@
package core
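The `InputProvider` interface decouples the engine from where targets come from: anything that can report a count and iterate values works. A minimal in-memory implementation (the `sliceProvider` name is illustrative; the real hybrid provider is in `pkg/core/inputs/hybrid/hmap.go`):

```go
package main

import "fmt"

// InputProvider mirrors the interface declared in pkg/core/engine.go.
type InputProvider interface {
	Count() int64
	Scan(callback func(value string))
}

// sliceProvider is a minimal in-memory implementation, the kind of thing
// a test or a library embedder could pass to Engine.Execute with a fixed
// target list.
type sliceProvider struct{ targets []string }

func (s *sliceProvider) Count() int64 { return int64(len(s.targets)) }

func (s *sliceProvider) Scan(callback func(value string)) {
	for _, t := range s.targets {
		callback(t)
	}
}

func main() {
	var p InputProvider = &sliceProvider{targets: []string{"https://a.example", "https://b.example"}}
	p.Scan(func(v string) { fmt.Println(v) })
	fmt.Println(p.Count()) // 2
}
```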
v2/pkg/core/execute.go (new file, 96 lines)
@@ -0,0 +1,96 @@
package core

import (
	"github.com/remeh/sizedwaitgroup"
	"go.uber.org/atomic"

	"github.com/projectdiscovery/gologger"
	"github.com/projectdiscovery/nuclei/v2/pkg/templates"
	"github.com/projectdiscovery/nuclei/v2/pkg/templates/types"
)

// Execute takes a list of templates/workflows that have been compiled
// and executes them based on provided concurrency options.
//
// All the execution logic for the templates/workflows happens in this part
// of the engine.
func (e *Engine) Execute(templates []*templates.Template, target InputProvider) *atomic.Bool {
	return e.ExecuteWithOpts(templates, target, false)
}

// ExecuteWithOpts executes with the full options
func (e *Engine) ExecuteWithOpts(templatesList []*templates.Template, target InputProvider, noCluster bool) *atomic.Bool {
	var finalTemplates []*templates.Template
	if !noCluster {
		finalTemplates, _ = templates.ClusterTemplates(templatesList, e.executerOpts)
	} else {
		finalTemplates = templatesList
	}

	results := &atomic.Bool{}
	for _, template := range finalTemplates {
		templateType := template.Type()

		var wg *sizedwaitgroup.SizedWaitGroup
		if templateType == types.HeadlessProtocol {
			wg = e.workPool.Headless
		} else {
			wg = e.workPool.Default
		}

		wg.Add()
		go func(tpl *templates.Template) {
			switch {
			case tpl.SelfContained:
				// Self Contained requests are executed here separately
				e.executeSelfContainedTemplateWithInput(tpl, results)
			default:
				// All other request types are executed here
				e.executeModelWithInput(templateType, tpl, target, results)
			}
			wg.Done()
		}(template)
	}
	e.workPool.Wait()
	return results
}

// executeSelfContainedTemplateWithInput executes a self-contained template.
func (e *Engine) executeSelfContainedTemplateWithInput(template *templates.Template, results *atomic.Bool) {
	match, err := template.Executer.Execute("")
	if err != nil {
		gologger.Warning().Msgf("[%s] Could not execute step: %s\n", e.executerOpts.Colorizer.BrightBlue(template.ID), err)
	}
	results.CAS(false, match)
}

// executeModelWithInput executes a type of template with input
func (e *Engine) executeModelWithInput(templateType types.ProtocolType, template *templates.Template, target InputProvider, results *atomic.Bool) {
	wg := e.workPool.InputPool(templateType)

	target.Scan(func(scannedValue string) {
		// Skip if the host has had errors
		if e.executerOpts.HostErrorsCache != nil && e.executerOpts.HostErrorsCache.Check(scannedValue) {
			return
		}

		wg.WaitGroup.Add()
		go func(value string) {
			defer wg.WaitGroup.Done()

			var match bool
			var err error
			switch templateType {
			case types.WorkflowProtocol:
				match = e.executeWorkflow(value, template.CompiledWorkflow)
			default:
				match, err = template.Executer.Execute(value)
			}
			if err != nil {
				gologger.Warning().Msgf("[%s] Could not execute step: %s\n", e.executerOpts.Colorizer.BrightBlue(template.ID), err)
			}
			results.CAS(false, match)
		}(scannedValue)
	})
	wg.WaitGroup.Wait()
}
135
v2/pkg/core/inputs/hybrid/hmap.go
Normal file
135
v2/pkg/core/inputs/hybrid/hmap.go
Normal file
@ -0,0 +1,135 @@
|
||||
// Package hybrid implements a hybrid hmap/filekv backed input provider
// for nuclei that can either stream or store results using different kv stores.
package hybrid

import (
	"bufio"
	"io"
	"os"
	"strings"

	"github.com/pkg/errors"

	"github.com/projectdiscovery/filekv"
	"github.com/projectdiscovery/fileutil"
	"github.com/projectdiscovery/gologger"
	"github.com/projectdiscovery/hmap/store/hybrid"
	"github.com/projectdiscovery/nuclei/v2/pkg/types"
)

// Input is a hmap/filekv backed nuclei Input provider
type Input struct {
	inputCount    int64
	dupeCount     int64
	hostMap       *hybrid.HybridMap
	hostMapStream *filekv.FileDB
}

// New creates a new hmap backed nuclei Input Provider
// and initializes it based on the passed options Model.
func New(options *types.Options) (*Input, error) {
	hm, err := hybrid.New(hybrid.DefaultDiskOptions)
	if err != nil {
		return nil, errors.Wrap(err, "could not create temporary input file")
	}

	input := &Input{hostMap: hm}
	if options.Stream {
		fkvOptions := filekv.DefaultOptions
		if tmpFileName, err := fileutil.GetTempFileName(); err != nil {
			return nil, errors.Wrap(err, "could not create temporary input file")
		} else {
			fkvOptions.Path = tmpFileName
		}
		fkv, err := filekv.Open(fkvOptions)
		if err != nil {
			return nil, errors.Wrap(err, "could not create temporary unsorted input file")
		}
		input.hostMapStream = fkv
	}
	if initErr := input.initializeInputSources(options); initErr != nil {
		return nil, initErr
	}
	if input.dupeCount > 0 {
		gologger.Info().Msgf("Supplied input was automatically deduplicated (%d removed).", input.dupeCount)
	}
	return input, nil
}

// Close closes the input provider
func (i *Input) Close() {
	i.hostMap.Close()
	if i.hostMapStream != nil {
		i.hostMapStream.Close()
	}
}

// initializeInputSources initializes the input sources for hmap input
func (i *Input) initializeInputSources(options *types.Options) error {
	// Handle targets flags
	for _, target := range options.Targets {
		i.normalizeStoreInputValue(target)
	}

	// Handle stdin
	if options.Stdin {
		i.scanInputFromReader(os.Stdin)
	}

	// Handle target file
	if options.TargetsFilePath != "" {
		input, inputErr := os.Open(options.TargetsFilePath)
		if inputErr != nil {
			return errors.Wrap(inputErr, "could not open targets file")
		}
		i.scanInputFromReader(input)
		input.Close()
	}
	return nil
}

// scanInputFromReader scans a line of input from reader and passes it for storage
func (i *Input) scanInputFromReader(reader io.Reader) {
	scanner := bufio.NewScanner(reader)
	for scanner.Scan() {
		i.normalizeStoreInputValue(scanner.Text())
	}
}

// normalizeStoreInputValue normalizes and stores passed input values
func (i *Input) normalizeStoreInputValue(value string) {
	url := strings.TrimSpace(value)
	if url == "" {
		return
	}

	if _, ok := i.hostMap.Get(url); ok {
		i.dupeCount++
		return
	}

	i.inputCount++
	_ = i.hostMap.Set(url, nil)
	if i.hostMapStream != nil {
		_ = i.hostMapStream.Set([]byte(url), nil)
	}
}

// Count returns the input count
func (i *Input) Count() int64 {
	return i.inputCount
}

// Scan iterates the input and each found item is passed to the
// callback consumer.
func (i *Input) Scan(callback func(value string)) {
	callbackFunc := func(k, _ []byte) error {
		callback(string(k))
		return nil
	}
	if i.hostMapStream != nil {
		_ = i.hostMapStream.Scan(callbackFunc)
	} else {
		i.hostMap.Scan(callbackFunc)
	}
}
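The trim/skip/dedupe behaviour of `normalizeStoreInputValue` above can be sketched with a plain in-memory map standing in for the hmap store (a simplified illustration; the `store` type and its fields are not part of nuclei):

```go
package main

import (
	"fmt"
	"strings"
)

// store mimics the provider's dedupe-on-store logic:
// trim whitespace, skip blanks, count duplicates.
type store struct {
	seen       map[string]struct{}
	inputCount int64
	dupeCount  int64
}

func (s *store) add(value string) {
	url := strings.TrimSpace(value)
	if url == "" {
		return // blank lines are dropped silently
	}
	if _, ok := s.seen[url]; ok {
		s.dupeCount++ // already stored, only count the dupe
		return
	}
	s.inputCount++
	s.seen[url] = struct{}{}
}

func main() {
	s := &store{seen: map[string]struct{}{}}
	for _, v := range []string{"a.com", " a.com ", "b.com", ""} {
		s.add(v)
	}
	fmt.Println(s.inputCount, s.dupeCount) // 2 1
}
```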
17
v2/pkg/core/inputs/inputs.go
Normal file
@ -0,0 +1,17 @@
package inputs

// SimpleInputProvider is a static in-memory input provider
type SimpleInputProvider struct {
	Inputs []string
}

// Count returns the number of items for input provider
func (s *SimpleInputProvider) Count() int64 {
	return int64(len(s.Inputs))
}

// Scan calls a callback function till the input provider is exhausted
func (s *SimpleInputProvider) Scan(callback func(value string)) {
	for _, v := range s.Inputs {
		callback(v)
	}
}
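A standalone demo of the callback-based provider contract above (the type is copied here verbatim so the snippet compiles on its own):

```go
package main

import "fmt"

// SimpleInputProvider mirrors the provider above: a static slice
// exposed through the same Count/Scan interface the engine consumes.
type SimpleInputProvider struct {
	Inputs []string
}

func (s *SimpleInputProvider) Count() int64 { return int64(len(s.Inputs)) }

func (s *SimpleInputProvider) Scan(callback func(value string)) {
	for _, v := range s.Inputs {
		callback(v)
	}
}

func main() {
	p := &SimpleInputProvider{Inputs: []string{"https://a.test", "https://b.test"}}
	fmt.Println(p.Count()) // 2
	p.Scan(func(value string) {
		fmt.Println(value)
	})
}
```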
@ -1,21 +1,23 @@
package workflows
package core

import (
	"github.com/projectdiscovery/gologger"
	"github.com/projectdiscovery/nuclei/v2/pkg/output"
	"github.com/remeh/sizedwaitgroup"
	"go.uber.org/atomic"

	"github.com/projectdiscovery/gologger"
	"github.com/projectdiscovery/nuclei/v2/pkg/output"
	"github.com/projectdiscovery/nuclei/v2/pkg/workflows"
)

// RunWorkflow runs a workflow on an input and returns true or false
func (w *Workflow) RunWorkflow(input string) bool {
// executeWorkflow runs a workflow on an input and returns true or false
func (e *Engine) executeWorkflow(input string, w *workflows.Workflow) bool {
	results := &atomic.Bool{}

	swg := sizedwaitgroup.New(w.Options.Options.TemplateThreads)
	for _, template := range w.Workflows {
		swg.Add()
		func(template *WorkflowTemplate) {
			if err := w.runWorkflowStep(template, input, results, &swg); err != nil {
		func(template *workflows.WorkflowTemplate) {
			if err := e.runWorkflowStep(template, input, results, &swg, w); err != nil {
				gologger.Warning().Msgf("[%s] Could not execute workflow step: %s\n", template.Template, err)
			}
			swg.Done()
@ -27,7 +29,7 @@ func (w *Workflow) RunWorkflow(input string) bool {

// runWorkflowStep runs a workflow step for the workflow. It executes the workflow
// in a recursive manner running all subtemplates and matchers.
func (w *Workflow) runWorkflowStep(template *WorkflowTemplate, input string, results *atomic.Bool, swg *sizedwaitgroup.SizedWaitGroup) error {
func (e *Engine) runWorkflowStep(template *workflows.WorkflowTemplate, input string, results *atomic.Bool, swg *sizedwaitgroup.SizedWaitGroup, w *workflows.Workflow) error {
	var firstMatched bool
	var err error
	var mainErr error
@ -90,8 +92,8 @@ func (w *Workflow) runWorkflowStep(template *WorkflowTemplate, input string, res
	for _, subtemplate := range matcher.Subtemplates {
		swg.Add()

		go func(subtemplate *WorkflowTemplate) {
			if err := w.runWorkflowStep(subtemplate, input, results, swg); err != nil {
		go func(subtemplate *workflows.WorkflowTemplate) {
			if err := e.runWorkflowStep(subtemplate, input, results, swg, w); err != nil {
				gologger.Warning().Msgf("[%s] Could not execute workflow step: %s\n", subtemplate.Template, err)
			}
			swg.Done()
@ -114,8 +116,8 @@ func (w *Workflow) runWorkflowStep(template *WorkflowTemplate, input string, res
	for _, subtemplate := range template.Subtemplates {
		swg.Add()

		go func(template *WorkflowTemplate) {
			if err := w.runWorkflowStep(template, input, results, swg); err != nil {
		go func(template *workflows.WorkflowTemplate) {
			if err := e.runWorkflowStep(template, input, results, swg, w); err != nil {
				gologger.Warning().Msgf("[%s] Could not execute workflow step: %s\n", template.Template, err)
			}
			swg.Done()
@ -1,4 +1,4 @@
package workflows
package core

import (
	"testing"
@ -10,18 +10,20 @@ import (
	"github.com/projectdiscovery/nuclei/v2/pkg/progress"
	"github.com/projectdiscovery/nuclei/v2/pkg/protocols"
	"github.com/projectdiscovery/nuclei/v2/pkg/types"
	"github.com/projectdiscovery/nuclei/v2/pkg/workflows"
)

func TestWorkflowsSimple(t *testing.T) {
	progressBar, _ := progress.NewStatsTicker(0, false, false, false, 0)

	workflow := &Workflow{Options: &protocols.ExecuterOptions{Options: &types.Options{TemplateThreads: 10}}, Workflows: []*WorkflowTemplate{
		{Executers: []*ProtocolExecuterPair{{
	workflow := &workflows.Workflow{Options: &protocols.ExecuterOptions{Options: &types.Options{TemplateThreads: 10}}, Workflows: []*workflows.WorkflowTemplate{
		{Executers: []*workflows.ProtocolExecuterPair{{
			Executer: &mockExecuter{result: true}, Options: &protocols.ExecuterOptions{Progress: progressBar}},
		}},
	}}

	matched := workflow.RunWorkflow("https://test.com")
	engine := &Engine{}
	matched := engine.executeWorkflow("https://test.com", workflow)
	require.True(t, matched, "could not get correct match value")
}

@ -29,20 +31,21 @@ func TestWorkflowsSimpleMultiple(t *testing.T) {
	progressBar, _ := progress.NewStatsTicker(0, false, false, false, 0)

	var firstInput, secondInput string
	workflow := &Workflow{Options: &protocols.ExecuterOptions{Options: &types.Options{TemplateThreads: 10}}, Workflows: []*WorkflowTemplate{
		{Executers: []*ProtocolExecuterPair{{
	workflow := &workflows.Workflow{Options: &protocols.ExecuterOptions{Options: &types.Options{TemplateThreads: 10}}, Workflows: []*workflows.WorkflowTemplate{
		{Executers: []*workflows.ProtocolExecuterPair{{
			Executer: &mockExecuter{result: true, executeHook: func(input string) {
				firstInput = input
			}}, Options: &protocols.ExecuterOptions{Progress: progressBar}},
		}},
		{Executers: []*ProtocolExecuterPair{{
		{Executers: []*workflows.ProtocolExecuterPair{{
			Executer: &mockExecuter{result: true, executeHook: func(input string) {
				secondInput = input
			}}, Options: &protocols.ExecuterOptions{Progress: progressBar}},
		}},
	}}

	matched := workflow.RunWorkflow("https://test.com")
	engine := &Engine{}
	matched := engine.executeWorkflow("https://test.com", workflow)
	require.True(t, matched, "could not get correct match value")

	require.Equal(t, "https://test.com", firstInput, "could not get correct first input")
@ -53,21 +56,22 @@ func TestWorkflowsSubtemplates(t *testing.T) {
	progressBar, _ := progress.NewStatsTicker(0, false, false, false, 0)

	var firstInput, secondInput string
	workflow := &Workflow{Options: &protocols.ExecuterOptions{Options: &types.Options{TemplateThreads: 10}}, Workflows: []*WorkflowTemplate{
		{Executers: []*ProtocolExecuterPair{{
	workflow := &workflows.Workflow{Options: &protocols.ExecuterOptions{Options: &types.Options{TemplateThreads: 10}}, Workflows: []*workflows.WorkflowTemplate{
		{Executers: []*workflows.ProtocolExecuterPair{{
			Executer: &mockExecuter{result: true, executeHook: func(input string) {
				firstInput = input
			}, outputs: []*output.InternalWrappedEvent{
				{OperatorsResult: &operators.Result{}, Results: []*output.ResultEvent{{}}},
			}}, Options: &protocols.ExecuterOptions{Progress: progressBar}},
		}, Subtemplates: []*WorkflowTemplate{{Executers: []*ProtocolExecuterPair{{
		}, Subtemplates: []*workflows.WorkflowTemplate{{Executers: []*workflows.ProtocolExecuterPair{{
			Executer: &mockExecuter{result: true, executeHook: func(input string) {
				secondInput = input
			}}, Options: &protocols.ExecuterOptions{Progress: progressBar}},
		}}}},
	}}

	matched := workflow.RunWorkflow("https://test.com")
	engine := &Engine{}
	matched := engine.executeWorkflow("https://test.com", workflow)
	require.True(t, matched, "could not get correct match value")

	require.Equal(t, "https://test.com", firstInput, "could not get correct first input")
@ -78,19 +82,20 @@ func TestWorkflowsSubtemplatesNoMatch(t *testing.T) {
	progressBar, _ := progress.NewStatsTicker(0, false, false, false, 0)

	var firstInput, secondInput string
	workflow := &Workflow{Options: &protocols.ExecuterOptions{Options: &types.Options{TemplateThreads: 10}}, Workflows: []*WorkflowTemplate{
		{Executers: []*ProtocolExecuterPair{{
	workflow := &workflows.Workflow{Options: &protocols.ExecuterOptions{Options: &types.Options{TemplateThreads: 10}}, Workflows: []*workflows.WorkflowTemplate{
		{Executers: []*workflows.ProtocolExecuterPair{{
			Executer: &mockExecuter{result: false, executeHook: func(input string) {
				firstInput = input
			}}, Options: &protocols.ExecuterOptions{Progress: progressBar}},
		}, Subtemplates: []*WorkflowTemplate{{Executers: []*ProtocolExecuterPair{{
		}, Subtemplates: []*workflows.WorkflowTemplate{{Executers: []*workflows.ProtocolExecuterPair{{
			Executer: &mockExecuter{result: true, executeHook: func(input string) {
				secondInput = input
			}}, Options: &protocols.ExecuterOptions{Progress: progressBar}},
		}}}},
	}}

	matched := workflow.RunWorkflow("https://test.com")
	engine := &Engine{}
	matched := engine.executeWorkflow("https://test.com", workflow)
	require.False(t, matched, "could not get correct match value")

	require.Equal(t, "https://test.com", firstInput, "could not get correct first input")
@ -101,8 +106,8 @@ func TestWorkflowsSubtemplatesWithMatcher(t *testing.T) {
	progressBar, _ := progress.NewStatsTicker(0, false, false, false, 0)

	var firstInput, secondInput string
	workflow := &Workflow{Options: &protocols.ExecuterOptions{Options: &types.Options{TemplateThreads: 10}}, Workflows: []*WorkflowTemplate{
		{Executers: []*ProtocolExecuterPair{{
	workflow := &workflows.Workflow{Options: &protocols.ExecuterOptions{Options: &types.Options{TemplateThreads: 10}}, Workflows: []*workflows.WorkflowTemplate{
		{Executers: []*workflows.ProtocolExecuterPair{{
			Executer: &mockExecuter{result: true, executeHook: func(input string) {
				firstInput = input
			}, outputs: []*output.InternalWrappedEvent{
@ -111,14 +116,15 @@ func TestWorkflowsSubtemplatesWithMatcher(t *testing.T) {
				Extracts: map[string][]string{},
			}},
		}}, Options: &protocols.ExecuterOptions{Progress: progressBar}},
		}, Matchers: []*Matcher{{Name: "tomcat", Subtemplates: []*WorkflowTemplate{{Executers: []*ProtocolExecuterPair{{
		}, Matchers: []*workflows.Matcher{{Name: "tomcat", Subtemplates: []*workflows.WorkflowTemplate{{Executers: []*workflows.ProtocolExecuterPair{{
			Executer: &mockExecuter{result: true, executeHook: func(input string) {
				secondInput = input
			}}, Options: &protocols.ExecuterOptions{Progress: progressBar}},
		}}}}}},
	}}

	matched := workflow.RunWorkflow("https://test.com")
	engine := &Engine{}
	matched := engine.executeWorkflow("https://test.com", workflow)
	require.True(t, matched, "could not get correct match value")

	require.Equal(t, "https://test.com", firstInput, "could not get correct first input")
@ -129,8 +135,8 @@ func TestWorkflowsSubtemplatesWithMatcherNoMatch(t *testing.T) {
	progressBar, _ := progress.NewStatsTicker(0, false, false, false, 0)

	var firstInput, secondInput string
	workflow := &Workflow{Options: &protocols.ExecuterOptions{Options: &types.Options{TemplateThreads: 10}}, Workflows: []*WorkflowTemplate{
		{Executers: []*ProtocolExecuterPair{{
	workflow := &workflows.Workflow{Options: &protocols.ExecuterOptions{Options: &types.Options{TemplateThreads: 10}}, Workflows: []*workflows.WorkflowTemplate{
		{Executers: []*workflows.ProtocolExecuterPair{{
			Executer: &mockExecuter{result: true, executeHook: func(input string) {
				firstInput = input
			}, outputs: []*output.InternalWrappedEvent{
@ -139,14 +145,15 @@ func TestWorkflowsSubtemplatesWithMatcherNoMatch(t *testing.T) {
				Extracts: map[string][]string{},
			}},
		}}, Options: &protocols.ExecuterOptions{Progress: progressBar}},
		}, Matchers: []*Matcher{{Name: "apache", Subtemplates: []*WorkflowTemplate{{Executers: []*ProtocolExecuterPair{{
		}, Matchers: []*workflows.Matcher{{Name: "apache", Subtemplates: []*workflows.WorkflowTemplate{{Executers: []*workflows.ProtocolExecuterPair{{
			Executer: &mockExecuter{result: true, executeHook: func(input string) {
				secondInput = input
			}}, Options: &protocols.ExecuterOptions{Progress: progressBar}},
		}}}}}},
	}}

	matched := workflow.RunWorkflow("https://test.com")
	engine := &Engine{}
	matched := engine.executeWorkflow("https://test.com", workflow)
	require.False(t, matched, "could not get correct match value")

	require.Equal(t, "https://test.com", firstInput, "could not get correct first input")
65
v2/pkg/core/workpool.go
Normal file
@ -0,0 +1,65 @@
package core

import (
	"github.com/remeh/sizedwaitgroup"

	"github.com/projectdiscovery/nuclei/v2/pkg/templates/types"
)

// WorkPool implements an execution pool for executing different
// types of tasks with different concurrency requirements.
//
// It also allows configuration of such requirements. This is used
// for per-module needs, like a separate headless concurrency, etc.
type WorkPool struct {
	Headless *sizedwaitgroup.SizedWaitGroup
	Default  *sizedwaitgroup.SizedWaitGroup
	config   WorkPoolConfig
}

// WorkPoolConfig is the configuration for work pool
type WorkPoolConfig struct {
	// InputConcurrency is the concurrency for input values.
	InputConcurrency int
	// TypeConcurrency is the concurrency for the request type templates.
	TypeConcurrency int
	// HeadlessInputConcurrency is the concurrency for headless input values.
	HeadlessInputConcurrency int
	// HeadlessTypeConcurrency is the concurrency for the headless request type templates.
	HeadlessTypeConcurrency int
}

// NewWorkPool returns a new WorkPool instance
func NewWorkPool(config WorkPoolConfig) *WorkPool {
	headlessWg := sizedwaitgroup.New(config.HeadlessTypeConcurrency)
	defaultWg := sizedwaitgroup.New(config.TypeConcurrency)

	return &WorkPool{
		config:   config,
		Headless: &headlessWg,
		Default:  &defaultWg,
	}
}

// Wait waits for all the work pool wait groups to finish
func (w *WorkPool) Wait() {
	w.Default.Wait()
	w.Headless.Wait()
}

// InputWorkPool is a work pool per-input
type InputWorkPool struct {
	WaitGroup *sizedwaitgroup.SizedWaitGroup
}

// InputPool returns a work pool for an input type
func (w *WorkPool) InputPool(templateType types.ProtocolType) *InputWorkPool {
	var count int
	if templateType == types.HeadlessProtocol {
		count = w.config.HeadlessInputConcurrency
	} else {
		count = w.config.InputConcurrency
	}
	swg := sizedwaitgroup.New(count)
	return &InputWorkPool{WaitGroup: &swg}
}
@ -50,13 +50,6 @@ type Info struct {
	Reference stringslice.StringSlice `json:"reference,omitempty" yaml:"reference,omitempty" jsonschema:"title=references for the template,description=Links relevant to the template"`
	// description: |
	//   Severity of the template.
	//
	//   values:
	//     - info
	//     - low
	//     - medium
	//     - high
	//     - critical
	SeverityHolder severity.Holder `json:"severity,omitempty" yaml:"severity,omitempty"`
	// description: |
	//   Metadata of the template.

@ -72,6 +72,7 @@ func TestUnmarshal(t *testing.T) {
}

	assertUnmarshalledTemplateInfo := func(t *testing.T, yamlPayload string) Info {
		t.Helper()
		info := Info{}
		err := yaml.Unmarshal([]byte(yamlPayload), &info)
		assert.Nil(t, err)
@ -43,7 +43,7 @@ func (severities *Severities) UnmarshalYAML(unmarshal func(interface{}) error) e
}

func (severities Severities) String() string {
	var stringSeverities []string
	var stringSeverities = make([]string, 0, len(severities))
	for _, severity := range severities {
		stringSeverities = append(stringSeverities, severity.String())
	}

@ -1,19 +1,28 @@
package severity

import (
	"encoding/json"
	"strings"

	"github.com/alecthomas/jsonschema"
	"github.com/pkg/errors"
)

type Severity int

// name:Severity
const (
	// name:undefined
	Undefined Severity = iota
	// name:info
	Info
	// name:low
	Low
	// name:medium
	Medium
	// name:high
	High
	// name:critical
	Critical
	limit
)
@ -51,3 +60,44 @@ func normalizeValue(value string) string {
func (severity Severity) String() string {
	return severityMappings[severity]
}

//nolint:exported,revive //prefer to be explicit about the name, and make it refactor-safe
// Holder holds a Severity type. Required for un/marshalling purposes
type Holder struct {
	Severity Severity `mapping:"true"`
}

func (severityHolder Holder) JSONSchemaType() *jsonschema.Type {
	gotType := &jsonschema.Type{
		Type:        "string",
		Title:       "severity of the template",
		Description: "Seriousness of the implications of the template",
	}
	for _, severity := range GetSupportedSeverities() {
		gotType.Enum = append(gotType.Enum, severity.String())
	}
	return gotType
}

func (severityHolder *Holder) UnmarshalYAML(unmarshal func(interface{}) error) error {
	var marshalledSeverity string
	if err := unmarshal(&marshalledSeverity); err != nil {
		return err
	}

	computedSeverity, err := toSeverity(marshalledSeverity)
	if err != nil {
		return err
	}

	severityHolder.Severity = computedSeverity
	return nil
}

func (severityHolder *Holder) MarshalJSON() ([]byte, error) {
	return json.Marshal(severityHolder.Severity.String())
}

func (severityHolder Holder) MarshalYAML() (interface{}, error) {
	return severityHolder.Severity.String(), nil
}
@ -1,48 +0,0 @@
package severity

import (
	"encoding/json"

	"github.com/alecthomas/jsonschema"
)

//nolint:exported,revive //prefer to be explicit about the name, and make it refactor-safe
// Holder holds a Severity type. Required for un/marshalling purposes
type Holder struct {
	Severity Severity
}

func (severityHolder Holder) JSONSchemaType() *jsonschema.Type {
	gotType := &jsonschema.Type{
		Type:        "string",
		Title:       "severity of the template",
		Description: "Seriousness of the implications of the template",
	}
	for _, severity := range GetSupportedSeverities() {
		gotType.Enum = append(gotType.Enum, severity.String())
	}
	return gotType
}

func (severityHolder *Holder) UnmarshalYAML(unmarshal func(interface{}) error) error {
	var marshalledSeverity string
	if err := unmarshal(&marshalledSeverity); err != nil {
		return err
	}

	computedSeverity, err := toSeverity(marshalledSeverity)
	if err != nil {
		return err
	}

	severityHolder.Severity = computedSeverity
	return nil
}

func (severityHolder *Holder) MarshalJSON() ([]byte, error) {
	return json.Marshal(severityHolder.Severity.String())
}

func (severityHolder Holder) MarshalYAML() (interface{}, error) {
	return severityHolder.Severity.String(), nil
}
@ -30,6 +30,7 @@ func TestGetSupportedSeverities(t *testing.T) {
}

func testUnmarshal(t *testing.T, unmarshaller func(data []byte, v interface{}) error, payloadCreator func(value string) string) {
	t.Helper()
	payloads := [...]string{
		payloadCreator("Info"),
		payloadCreator("info"),
@ -48,6 +49,7 @@ func testUnmarshal(t *testing.T, unmarshaller func(data []byte, v interface{}) e
}

func testUnmarshalFail(t *testing.T, unmarshaller func(data []byte, v interface{}) error, payloadCreator func(value string) string) {
	t.Helper()
	assert.Panics(t, func() { unmarshal(payloadCreator("invalid"), unmarshaller) })
}
@ -80,11 +80,12 @@ func marshalStringToSlice(unmarshal func(interface{}) error) ([]string, error) {
}

	var result []string
	if len(marshalledValuesAsSlice) > 0 {
	switch {
	case len(marshalledValuesAsSlice) > 0:
		result = marshalledValuesAsSlice
	} else if utils.IsNotBlank(marshalledValueAsString) {
	case utils.IsNotBlank(marshalledValueAsString):
		result = strings.Split(marshalledValueAsString, ",")
	} else {
	default:
		result = []string{}
	}
@ -1,6 +1,8 @@
package dsl

import (
	"bytes"
	"compress/gzip"
	"crypto/md5"
	"crypto/sha1"
	"crypto/sha256"
@ -17,10 +19,11 @@ import (
	"time"

	"github.com/Knetic/govaluate"
	"github.com/spaolacci/murmur3"

	"github.com/projectdiscovery/gologger"
	"github.com/projectdiscovery/nuclei/v2/pkg/protocols/common/helpers/deserialization"
	"github.com/projectdiscovery/nuclei/v2/pkg/types"
	"github.com/spaolacci/murmur3"
)

const (
@ -31,21 +34,38 @@ const (
	withMaxRandArgsSize = withCutSetArgsSize
)

var ErrDSLArguments = errors.New("invalid arguments provided to dsl")

var functions = map[string]govaluate.ExpressionFunction{
	"len": func(args ...interface{}) (interface{}, error) {
		if len(args) != 1 {
			return nil, ErrDSLArguments
		}
		length := len(types.ToString(args[0]))
		return float64(length), nil
	},
	"toupper": func(args ...interface{}) (interface{}, error) {
		if len(args) != 1 {
			return nil, ErrDSLArguments
		}
		return strings.ToUpper(types.ToString(args[0])), nil
	},
	"tolower": func(args ...interface{}) (interface{}, error) {
		if len(args) != 1 {
			return nil, ErrDSLArguments
		}
		return strings.ToLower(types.ToString(args[0])), nil
	},
	"replace": func(args ...interface{}) (interface{}, error) {
		if len(args) != 3 {
			return nil, ErrDSLArguments
		}
		return strings.ReplaceAll(types.ToString(args[0]), types.ToString(args[1]), types.ToString(args[2])), nil
	},
	"replace_regex": func(args ...interface{}) (interface{}, error) {
		if len(args) != 3 {
			return nil, ErrDSLArguments
		}
		compiled, err := regexp.Compile(types.ToString(args[1]))
		if err != nil {
			return nil, err
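The hunk above adds an arity guard (`ErrDSLArguments`) to every DSL function. The underlying pattern — a map of variadic functions, each validating its own argument count — can be sketched without govaluate (the names below are illustrative only):

```go
package main

import (
	"errors"
	"fmt"
	"strings"
)

var errArgs = errors.New("invalid arguments provided to dsl")

// function mirrors the shape of govaluate.ExpressionFunction.
type function func(args ...interface{}) (interface{}, error)

// functions maps DSL names to arity-checked implementations.
var functions = map[string]function{
	"len": func(args ...interface{}) (interface{}, error) {
		if len(args) != 1 {
			return nil, errArgs // wrong arity is rejected up front
		}
		return float64(len(fmt.Sprint(args[0]))), nil
	},
	"toupper": func(args ...interface{}) (interface{}, error) {
		if len(args) != 1 {
			return nil, errArgs
		}
		return strings.ToUpper(fmt.Sprint(args[0])), nil
	},
}

func main() {
	v, _ := functions["toupper"]("nuclei")
	n, _ := functions["len"]("nuclei")
	_, err := functions["len"]() // no arguments: rejected
	fmt.Println(v, n, err != nil) // NUCLEI 6 true
}
```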
@ -53,66 +73,133 @@ var functions = map[string]govaluate.ExpressionFunction{
		return compiled.ReplaceAllString(types.ToString(args[0]), types.ToString(args[2])), nil
	},
	"trim": func(args ...interface{}) (interface{}, error) {
		if len(args) != 2 {
			return nil, ErrDSLArguments
		}
		return strings.Trim(types.ToString(args[0]), types.ToString(args[1])), nil
	},
	"trimleft": func(args ...interface{}) (interface{}, error) {
		if len(args) != 2 {
			return nil, ErrDSLArguments
		}
		return strings.TrimLeft(types.ToString(args[0]), types.ToString(args[1])), nil
	},
	"trimright": func(args ...interface{}) (interface{}, error) {
		if len(args) != 2 {
			return nil, ErrDSLArguments
		}
		return strings.TrimRight(types.ToString(args[0]), types.ToString(args[1])), nil
	},
	"trimspace": func(args ...interface{}) (interface{}, error) {
		if len(args) != 1 {
			return nil, ErrDSLArguments
		}
		return strings.TrimSpace(types.ToString(args[0])), nil
	},
	"trimprefix": func(args ...interface{}) (interface{}, error) {
		if len(args) != 2 {
			return nil, ErrDSLArguments
		}
		return strings.TrimPrefix(types.ToString(args[0]), types.ToString(args[1])), nil
	},
	"trimsuffix": func(args ...interface{}) (interface{}, error) {
		if len(args) != 2 {
			return nil, ErrDSLArguments
		}
		return strings.TrimSuffix(types.ToString(args[0]), types.ToString(args[1])), nil
	},
	"reverse": func(args ...interface{}) (interface{}, error) {
		if len(args) != 1 {
			return nil, ErrDSLArguments
		}
		return reverseString(types.ToString(args[0])), nil
	},
	// encoding
	"base64": func(args ...interface{}) (interface{}, error) {
		if len(args) != 1 {
			return nil, ErrDSLArguments
		}
		sEnc := base64.StdEncoding.EncodeToString([]byte(types.ToString(args[0])))

		return sEnc, nil
	},
	"gzip": func(args ...interface{}) (interface{}, error) {
		if len(args) != 1 {
			return nil, ErrDSLArguments
		}
		buffer := &bytes.Buffer{}
		writer := gzip.NewWriter(buffer)
		if _, err := writer.Write([]byte(args[0].(string))); err != nil {
			return "", err
		}
		_ = writer.Close()

		return buffer.String(), nil
	},
	// python encodes to base64 with lines of 76 bytes terminated by new line "\n"
	"base64_py": func(args ...interface{}) (interface{}, error) {
		if len(args) != 1 {
			return nil, ErrDSLArguments
		}
		sEnc := base64.StdEncoding.EncodeToString([]byte(types.ToString(args[0])))
		return deserialization.InsertInto(sEnc, 76, '\n'), nil
	},
	"base64_decode": func(args ...interface{}) (interface{}, error) {
		if len(args) != 1 {
			return nil, ErrDSLArguments
		}
		return base64.StdEncoding.DecodeString(types.ToString(args[0]))
	},
	"url_encode": func(args ...interface{}) (interface{}, error) {
		if len(args) != 1 {
			return nil, ErrDSLArguments
		}
		return url.QueryEscape(types.ToString(args[0])), nil
	},
	"url_decode": func(args ...interface{}) (interface{}, error) {
		if len(args) != 1 {
			return nil, ErrDSLArguments
		}
		return url.QueryUnescape(types.ToString(args[0]))
	},
	"hex_encode": func(args ...interface{}) (interface{}, error) {
		if len(args) != 1 {
			return nil, ErrDSLArguments
		}
		return hex.EncodeToString([]byte(types.ToString(args[0]))), nil
	},
	"hex_decode": func(args ...interface{}) (interface{}, error) {
		if len(args) != 1 {
			return nil, ErrDSLArguments
		}
		hx, _ := hex.DecodeString(types.ToString(args[0]))
		return string(hx), nil
	},
	"html_escape": func(args ...interface{}) (interface{}, error) {
		if len(args) != 1 {
			return nil, ErrDSLArguments
		}
		return html.EscapeString(types.ToString(args[0])), nil
	},
	"html_unescape": func(args ...interface{}) (interface{}, error) {
		if len(args) != 1 {
			return nil, ErrDSLArguments
		}
		return html.UnescapeString(types.ToString(args[0])), nil
	},
	// hashing
	"md5": func(args ...interface{}) (interface{}, error) {
		if len(args) != 1 {
			return nil, ErrDSLArguments
		}
		hash := md5.Sum([]byte(types.ToString(args[0])))

		return hex.EncodeToString(hash[:]), nil
	},
	"sha256": func(args ...interface{}) (interface{}, error) {
		if len(args) != 1 {
			return nil, ErrDSLArguments
		}
		h := sha256.New()
		if _, err := h.Write([]byte(types.ToString(args[0]))); err != nil {
			return nil, err
@ -120,6 +207,9 @@ var functions = map[string]govaluate.ExpressionFunction{
		return hex.EncodeToString(h.Sum(nil)), nil
	},
	"sha1": func(args ...interface{}) (interface{}, error) {
		if len(args) != 1 {
			return nil, ErrDSLArguments
		}
		h := sha1.New()
		if _, err := h.Write([]byte(types.ToString(args[0]))); err != nil {
			return nil, err
@ -127,13 +217,22 @@ var functions = map[string]govaluate.ExpressionFunction{
		return hex.EncodeToString(h.Sum(nil)), nil
	},
	"mmh3": func(args ...interface{}) (interface{}, error) {
		if len(args) != 1 {
			return nil, ErrDSLArguments
		}
		return fmt.Sprintf("%d", int32(murmur3.Sum32WithSeed([]byte(types.ToString(args[0])), 0))), nil
	},
	// search
	"contains": func(args ...interface{}) (interface{}, error) {
		if len(args) != 2 {
			return nil, ErrDSLArguments
		}
		return strings.Contains(types.ToString(args[0]), types.ToString(args[1])), nil
	},
	"regex": func(args ...interface{}) (interface{}, error) {
		if len(args) != 2 {
			return nil, ErrDSLArguments
		}
		compiled, err := regexp.Compile(types.ToString(args[0]))
		if err != nil {
			return nil, err
@ -142,6 +241,9 @@ var functions = map[string]govaluate.ExpressionFunction{
	},
	// random generators
	"rand_char": func(args ...interface{}) (interface{}, error) {
		if len(args) != 2 {
			return nil, ErrDSLArguments
		}
		chars := letters + numbers
		bad := ""
		if len(args) >= 1 {
@ -154,6 +256,9 @@ var functions = map[string]govaluate.ExpressionFunction{
		return chars[rand.Intn(len(chars))], nil
	},
	"rand_base": func(args ...interface{}) (interface{}, error) {
		if len(args) != 3 {
			return nil, ErrDSLArguments
		}
		l := 0
		bad := ""
		base := letters + numbers
@ -171,6 +276,9 @@ var functions = map[string]govaluate.ExpressionFunction{
		return randSeq(base, l), nil
	},
	"rand_text_alphanumeric": func(args ...interface{}) (interface{}, error) {
		if len(args) != 2 {
			return nil, ErrDSLArguments
		}
		l := 0
		bad := ""
		chars := letters + numbers
@ -185,6 +293,9 @@ var functions = map[string]govaluate.ExpressionFunction{
		return randSeq(chars, l), nil
	},
	"rand_text_alpha": func(args ...interface{}) (interface{}, error) {
		if len(args) != 2 {
			return nil, ErrDSLArguments
		}
		l := 0
		bad := ""
		chars := letters
@ -199,6 +310,9 @@ var functions = map[string]govaluate.ExpressionFunction{
		return randSeq(chars, l), nil
	},
	"rand_text_numeric": func(args ...interface{}) (interface{}, error) {
		if len(args) != 2 {
|
||||
return nil, ErrDSLArguments
|
||||
}
|
||||
l := 0
|
||||
bad := ""
|
||||
chars := numbers
|
||||
@ -213,6 +327,9 @@ var functions = map[string]govaluate.ExpressionFunction{
|
||||
return randSeq(chars, l), nil
|
||||
},
|
||||
"rand_int": func(args ...interface{}) (interface{}, error) {
|
||||
if len(args) != 2 {
|
||||
return nil, ErrDSLArguments
|
||||
}
|
||||
min := 0
|
||||
max := math.MaxInt32
|
||||
|
||||
@ -231,16 +348,22 @@ var functions = map[string]govaluate.ExpressionFunction{
|
||||
}
|
||||
now := time.Now()
|
||||
offset := now.Add(time.Duration(seconds) * time.Second)
|
||||
return offset.Unix(), nil
|
||||
return float64(offset.Unix()), nil
|
||||
},
|
||||
// Time Functions
|
||||
"waitfor": func(args ...interface{}) (interface{}, error) {
|
||||
if len(args) != 1 {
|
||||
return nil, ErrDSLArguments
|
||||
}
|
||||
seconds := args[0].(float64)
|
||||
time.Sleep(time.Duration(seconds) * time.Second)
|
||||
return true, nil
|
||||
},
|
||||
// deserialization Functions
|
||||
"generate_java_gadget": func(args ...interface{}) (interface{}, error) {
|
||||
if len(args) != 3 {
|
||||
return nil, ErrDSLArguments
|
||||
}
|
||||
gadget := args[0].(string)
|
||||
cmd := args[1].(string)
|
||||
|
||||
|
||||
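The DSL helpers above all follow one pattern: a `map[string]govaluate.ExpressionFunction` whose entries validate their arity against `ErrDSLArguments` before doing any work. A minimal standalone sketch of that pattern (names like `helpers` and `errArguments` are illustrative, not the real nuclei identifiers):

```go
package main

import (
	"crypto/md5"
	"encoding/hex"
	"errors"
	"fmt"
)

// errArguments plays the role of ErrDSLArguments in the diff above:
// every helper checks its arity before doing any work.
var errArguments = errors.New("invalid number of arguments")

// helpers is a trimmed-down, self-contained sketch of the DSL
// function table pattern used by nuclei's dsl package.
var helpers = map[string]func(args ...interface{}) (interface{}, error){
	"md5": func(args ...interface{}) (interface{}, error) {
		if len(args) != 1 {
			return nil, errArguments
		}
		sum := md5.Sum([]byte(fmt.Sprint(args[0])))
		return hex.EncodeToString(sum[:]), nil
	},
}

func main() {
	out, err := helpers["md5"]("hello world")
	if err != nil {
		panic(err)
	}
	fmt.Println(out) // 5eb63bbbe01eeed093cb22bb8f5acdc3
}
```

Keeping the helpers behind a single map is what lets `HelperFunctions()` hand the whole table to govaluate in one call.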
@@ -1,9 +1,16 @@
package dsl

import (
"compress/gzip"
"io/ioutil"
"strings"
"testing"
"time"

"github.com/Knetic/govaluate"
"github.com/stretchr/testify/require"

"github.com/projectdiscovery/nuclei/v2/pkg/types"
)

func TestDSLURLEncodeDecode(t *testing.T) {
@@ -17,3 +24,25 @@ func TestDSLURLEncodeDecode(t *testing.T) {
require.Nil(t, err, "could not url encode")
require.Equal(t, "&test\"", decoded, "could not get url decoded data")
}

func TestDSLTimeComparison(t *testing.T) {
compiled, err := govaluate.NewEvaluableExpressionWithFunctions("unixtime() > not_after", HelperFunctions())
require.Nil(t, err, "could not compare time")

result, err := compiled.Evaluate(map[string]interface{}{"not_after": float64(time.Now().Unix() - 1000)})
require.Nil(t, err, "could not evaluate compare time")
require.Equal(t, true, result, "could not get url encoded data")
}

func TestDSLGzipSerialize(t *testing.T) {
compiled, err := govaluate.NewEvaluableExpressionWithFunctions("gzip(\"hello world\")", HelperFunctions())
require.Nil(t, err, "could not compare time")

result, err := compiled.Evaluate(make(map[string]interface{}))
require.Nil(t, err, "could not evaluate compare time")

reader, _ := gzip.NewReader(strings.NewReader(types.ToString(result)))
data, _ := ioutil.ReadAll(reader)

require.Equal(t, "hello world", string(data), "could not get gzip encoded data")
}
@@ -10,13 +10,12 @@ import (

// CompileExtractors performs the initial setup operation on an extractor
func (e *Extractor) CompileExtractors() error {
var ok bool
// Set up the extractor type
e.extractorType, ok = ExtractorTypes[e.Type]
if !ok {
computedType, err := toExtractorTypes(e.GetType().String())
if err != nil {
return fmt.Errorf("unknown extractor type specified: %s", e.Type)
}

e.extractorType = computedType
// Compile the regexes
for _, regex := range e.Regex {
compiled, err := regexp.Compile(regex)
@@ -25,7 +24,6 @@ func (e *Extractor) CompileExtractors() error {
}
e.regexCompiled = append(e.regexCompiled, compiled)
}

for i, kval := range e.KVal {
e.KVal[i] = strings.ToLower(kval)
}
@@ -42,9 +40,14 @@ func (e *Extractor) CompileExtractors() error {
e.jsonCompiled = append(e.jsonCompiled, compiled)
}

// Set up the part of the request to match, if any.
if e.Part == "" {
e.Part = "body"
if e.CaseInsensitive {
if e.GetType() != KValExtractor {
return fmt.Errorf("case-insensitive flag is supported only for 'kval' extractors (not '%s')", e.Type)
}
for i := range e.KVal {
e.KVal[i] = strings.ToLower(e.KVal[i])
}
}

return nil
}
@@ -1,9 +1,8 @@
package extractors

import (
"strings"

"encoding/json"
"strings"

"github.com/antchfx/htmlquery"

@@ -34,8 +33,18 @@ func (e *Extractor) ExtractRegex(corpus string) map[string]struct{} {

// ExtractKval extracts key value pairs from a data map
func (e *Extractor) ExtractKval(data map[string]interface{}) map[string]struct{} {
results := make(map[string]struct{})
if e.CaseInsensitive {
inputData := data
data = make(map[string]interface{}, len(inputData))
for k, v := range inputData {
if s, ok := v.(string); ok {
v = strings.ToLower(s)
}
data[strings.ToLower(k)] = v
}
}

results := make(map[string]struct{})
for _, k := range e.KVal {
item, ok := data[k]
if !ok {
105  v2/pkg/operators/extractors/extractor_types.go  Normal file
@@ -0,0 +1,105 @@
package extractors

import (
"encoding/json"
"errors"
"strings"

"github.com/alecthomas/jsonschema"
)

// ExtractorType is the type of the extractor specified
type ExtractorType int

// name:ExtractorType
const (
// name:regex
RegexExtractor ExtractorType = iota + 1
// name:kval
KValExtractor
// name:xpath
XPathExtractor
// name:json
JSONExtractor
limit
)

// extractorMappings is a table for conversion of extractor type from string.
var extractorMappings = map[ExtractorType]string{
RegexExtractor: "regex",
KValExtractor: "kval",
XPathExtractor: "xpath",
JSONExtractor: "json",
}

// GetType returns the type of the matcher
func (e *Extractor) GetType() ExtractorType {
return e.Type.ExtractorType
}

// GetSupportedExtractorTypes returns list of supported types
func GetSupportedExtractorTypes() []ExtractorType {
var result []ExtractorType
for index := ExtractorType(1); index < limit; index++ {
result = append(result, index)
}
return result
}

func toExtractorTypes(valueToMap string) (ExtractorType, error) {
normalizedValue := normalizeValue(valueToMap)
for key, currentValue := range extractorMappings {
if normalizedValue == currentValue {
return key, nil
}
}
return -1, errors.New("Invalid extractor type: " + valueToMap)
}

func normalizeValue(value string) string {
return strings.TrimSpace(strings.ToLower(value))
}

func (t ExtractorType) String() string {
return extractorMappings[t]
}

// ExtractorTypeHolder is used to hold internal type of the extractor
type ExtractorTypeHolder struct {
ExtractorType ExtractorType `mapping:"true"`
}

func (holder ExtractorTypeHolder) JSONSchemaType() *jsonschema.Type {
gotType := &jsonschema.Type{
Type: "string",
Title: "type of the extractor",
Description: "Type of the extractor",
}
for _, types := range GetSupportedExtractorTypes() {
gotType.Enum = append(gotType.Enum, types.String())
}
return gotType
}

func (holder *ExtractorTypeHolder) UnmarshalYAML(unmarshal func(interface{}) error) error {
var marshalledTypes string
if err := unmarshal(&marshalledTypes); err != nil {
return err
}

computedType, err := toExtractorTypes(marshalledTypes)
if err != nil {
return err
}

holder.ExtractorType = computedType
return nil
}

func (holder *ExtractorTypeHolder) MarshalJSON() ([]byte, error) {
return json.Marshal(holder.ExtractorType.String())
}

func (holder ExtractorTypeHolder) MarshalYAML() (interface{}, error) {
return holder.ExtractorType.String(), nil
}
@@ -16,12 +16,7 @@ type Extractor struct {
Name string `yaml:"name,omitempty" jsonschema:"title=name of the extractor,description=Name of the extractor"`
// description: |
//   Type is the type of the extractor.
// values:
//   - "regex"
//   - "kval"
//   - "json"
//   - "xpath"
Type string `yaml:"type" jsonschema:"title=type of the extractor,description=Type of the extractor,enum=regex,enum=kval,enum=json,enum=xpath"`
Type ExtractorTypeHolder `json:"name,omitempty" yaml:"type"`
// extractorType is the internal type of the extractor
extractorType ExtractorType

@@ -105,31 +100,11 @@ type Extractor struct {
//   Internal, when set to true will allow using the value extracted
//   in the next request for some protocols (like HTTP).
Internal bool `yaml:"internal,omitempty" jsonschema:"title=mark extracted value for internal variable use,description=Internal when set to true will allow using the value extracted in the next request for some protocols"`
}

// ExtractorType is the type of the extractor specified
type ExtractorType = int

const (
// RegexExtractor extracts responses with regexes
RegexExtractor ExtractorType = iota + 1
// KValExtractor extracts responses with key:value
KValExtractor
// XPathExtractor extracts responses with Xpath selectors
XPathExtractor
// JSONExtractor extracts responses with json
JSONExtractor
)

// ExtractorTypes is a table for conversion of extractor type from string.
var ExtractorTypes = map[string]ExtractorType{
"regex": RegexExtractor,
"kval": KValExtractor,
"xpath": XPathExtractor,
"json": JSONExtractor,
}

// GetType returns the type of the matcher
func (e *Extractor) GetType() ExtractorType {
return e.extractorType

// description: |
//   CaseInsensitive enables case-insensitive extractions. Default is false.
// values:
//   - false
//   - true
CaseInsensitive bool `yaml:"case-insensitive,omitempty" jsonschema:"title=use case insensitive extract,description=use case insensitive extract"`
}
@@ -4,6 +4,7 @@ import (
"encoding/hex"
"fmt"
"regexp"
"strings"

"github.com/Knetic/govaluate"

@@ -11,54 +12,74 @@ import (
)

// CompileMatchers performs the initial setup operation on a matcher
func (m *Matcher) CompileMatchers() error {
func (matcher *Matcher) CompileMatchers() error {
var ok bool

// Support hexadecimal encoding for matchers too.
if m.Encoding == "hex" {
for i, word := range m.Words {
if matcher.Encoding == "hex" {
for i, word := range matcher.Words {
if decoded, err := hex.DecodeString(word); err == nil && len(decoded) > 0 {
m.Words[i] = string(decoded)
matcher.Words[i] = string(decoded)
}
}
}

// Set up the matcher type
m.matcherType, ok = MatcherTypes[m.Type]
if !ok {
return fmt.Errorf("unknown matcher type specified: %s", m.Type)
computedType, err := toMatcherTypes(matcher.GetType().String())
if err != nil {
return fmt.Errorf("unknown matcher type specified: %s", matcher.Type)
}

matcher.matcherType = computedType
// By default, match on body if user hasn't provided any specific items
if m.Part == "" {
m.Part = "body"
if matcher.Part == "" {
matcher.Part = "body"
}

// Compile the regexes
for _, regex := range m.Regex {
for _, regex := range matcher.Regex {
compiled, err := regexp.Compile(regex)
if err != nil {
return fmt.Errorf("could not compile regex: %s", regex)
}
m.regexCompiled = append(m.regexCompiled, compiled)
matcher.regexCompiled = append(matcher.regexCompiled, compiled)
}

// Compile and validate binary Values in matcher
for _, value := range matcher.Binary {
if decoded, err := hex.DecodeString(value); err != nil {
return fmt.Errorf("could not hex decode binary: %s", value)
} else {
matcher.binaryDecoded = append(matcher.binaryDecoded, string(decoded))
}
}

// Compile the dsl expressions
for _, expr := range m.DSL {
for _, expr := range matcher.DSL {
compiled, err := govaluate.NewEvaluableExpressionWithFunctions(expr, dsl.HelperFunctions())
if err != nil {
return fmt.Errorf("could not compile dsl: %s", expr)
}
m.dslCompiled = append(m.dslCompiled, compiled)
matcher.dslCompiled = append(matcher.dslCompiled, compiled)
}

// Set up the condition type, if any.
if m.Condition != "" {
m.condition, ok = ConditionTypes[m.Condition]
if matcher.Condition != "" {
matcher.condition, ok = ConditionTypes[matcher.Condition]
if !ok {
return fmt.Errorf("unknown condition specified: %s", m.Condition)
return fmt.Errorf("unknown condition specified: %s", matcher.Condition)
}
} else {
m.condition = ORCondition
matcher.condition = ORCondition
}

if matcher.CaseInsensitive {
if matcher.GetType() != WordsMatcher {
return fmt.Errorf("case-insensitive flag is supported only for 'word' matchers (not '%s')", matcher.Type)
}
for i := range matcher.Words {
matcher.Words[i] = strings.ToLower(matcher.Words[i])
}
}
return nil
}
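One notable change in `CompileMatchers` above: binary matcher values are now hex-decoded once at compile time into `binaryDecoded`, instead of being decoded on every match. A minimal sketch of that compile-time decoding step (the standalone `decodeBinary` helper is illustrative, not a nuclei API):

```go
package main

import (
	"encoding/hex"
	"fmt"
)

// decodeBinary mirrors the new compile-time step in the diff: each hex
// value is decoded once, and an invalid value fails compilation with
// the same style of error message.
func decodeBinary(values []string) ([]string, error) {
	var decoded []string
	for _, value := range values {
		raw, err := hex.DecodeString(value)
		if err != nil {
			return nil, fmt.Errorf("could not hex decode binary: %s", value)
		}
		decoded = append(decoded, string(raw))
	}
	return decoded, nil
}

func main() {
	out, err := decodeBinary([]string{"50494e47"}) // hex for "PING"
	fmt.Println(out[0], err)
}
```

Decoding at compile time also turns a malformed hex value into a template-validation error rather than a per-request runtime warning, which is why `MatchBinary` in the next file no longer needs its own decode-and-warn path.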
@@ -1,19 +1,20 @@
package matchers

import (
"encoding/hex"
"strings"

"github.com/Knetic/govaluate"
"github.com/projectdiscovery/gologger"
"github.com/projectdiscovery/nuclei/v2/pkg/operators/common/dsl"
"github.com/projectdiscovery/nuclei/v2/pkg/protocols/common/expressions"
)

// MatchStatusCode matches a status code check against a corpus
func (m *Matcher) MatchStatusCode(statusCode int) bool {
func (matcher *Matcher) MatchStatusCode(statusCode int) bool {
// Iterate over all the status codes accepted as valid
//
// Status codes don't support AND conditions.
for _, status := range m.Status {
for _, status := range matcher.Status {
// Continue if the status codes don't match
if statusCode != status {
continue
@@ -25,11 +26,11 @@ func (m *Matcher) MatchStatusCode(statusCode int) bool {
}

// MatchSize matches a size check against a corpus
func (m *Matcher) MatchSize(length int) bool {
func (matcher *Matcher) MatchSize(length int) bool {
// Iterate over all the sizes accepted as valid
//
// Sizes codes don't support AND conditions.
for _, size := range m.Size {
for _, size := range matcher.Size {
// Continue if the size doesn't match
if length != size {
continue
@@ -41,16 +42,20 @@ func (m *Matcher) MatchSize(length int) bool {
}

// MatchWords matches a word check against a corpus.
func (m *Matcher) MatchWords(corpus string, dynamicValues map[string]interface{}) (bool, []string) {
func (matcher *Matcher) MatchWords(corpus string, data map[string]interface{}) (bool, []string) {
if matcher.CaseInsensitive {
corpus = strings.ToLower(corpus)
}

var matchedWords []string
// Iterate over all the words accepted as valid
for i, word := range m.Words {
if dynamicValues == nil {
dynamicValues = make(map[string]interface{})
for i, word := range matcher.Words {
if data == nil {
data = make(map[string]interface{})
}

var err error
word, err = expressions.Evaluate(word, dynamicValues)
word, err = expressions.Evaluate(word, data)
if err != nil {
continue
}
@@ -58,7 +63,7 @@ func (m *Matcher) MatchWords(corpus string, dynamicValues map[string]interface{}
if !strings.Contains(corpus, word) {
// If we are in an AND request and a match failed,
// return false as the AND condition fails on any single mismatch.
if m.condition == ANDCondition {
if matcher.condition == ANDCondition {
return false, []string{}
}
// Continue with the flow since it's an OR Condition.
@@ -66,14 +71,14 @@ func (m *Matcher) MatchWords(corpus string, dynamicValues map[string]interface{}
}

// If the condition was an OR, return on the first match.
if m.condition == ORCondition {
if matcher.condition == ORCondition {
return true, []string{word}
}

matchedWords = append(matchedWords, word)

// If we are at the end of the words, return with true
if len(m.Words)-1 == i {
if len(matcher.Words)-1 == i {
return true, matchedWords
}
}
@@ -81,15 +86,15 @@ func (m *Matcher) MatchWords(corpus string, dynamicValues map[string]interface{}
}

// MatchRegex matches a regex check against a corpus
func (m *Matcher) MatchRegex(corpus string) (bool, []string) {
func (matcher *Matcher) MatchRegex(corpus string) (bool, []string) {
var matchedRegexes []string
// Iterate over all the regexes accepted as valid
for i, regex := range m.regexCompiled {
for i, regex := range matcher.regexCompiled {
// Continue if the regex doesn't match
if !regex.MatchString(corpus) {
// If we are in an AND request and a match failed,
// return false as the AND condition fails on any single mismatch.
if m.condition == ANDCondition {
if matcher.condition == ANDCondition {
return false, []string{}
}
// Continue with the flow since it's an OR Condition.
@@ -98,14 +103,14 @@ func (m *Matcher) MatchRegex(corpus string) (bool, []string) {

currentMatches := regex.FindAllString(corpus, -1)
// If the condition was an OR, return on the first match.
if m.condition == ORCondition {
if matcher.condition == ORCondition {
return true, currentMatches
}

matchedRegexes = append(matchedRegexes, currentMatches...)

// If we are at the end of the regex, return with true
if len(m.regexCompiled)-1 == i {
if len(matcher.regexCompiled)-1 == i {
return true, matchedRegexes
}
}
@@ -113,23 +118,14 @@ func (m *Matcher) MatchRegex(corpus string) (bool, []string) {
}

// MatchBinary matches a binary check against a corpus
func (m *Matcher) MatchBinary(corpus string) (bool, []string) {
func (matcher *Matcher) MatchBinary(corpus string) (bool, []string) {
var matchedBinary []string
// Iterate over all the words accepted as valid
for i, binary := range m.Binary {
// Continue if the word doesn't match
hexa, err := hex.DecodeString(binary)
if err != nil {
gologger.Warning().Msgf("Could not hex encode the given binary matcher value: '%s'", binary)
if m.condition == ANDCondition {
return false, []string{}
}
continue
}
if !strings.Contains(corpus, string(hexa)) {
for i, binary := range matcher.binaryDecoded {
if !strings.Contains(corpus, binary) {
// If we are in an AND request and a match failed,
// return false as the AND condition fails on any single mismatch.
if m.condition == ANDCondition {
if matcher.condition == ANDCondition {
return false, []string{}
}
// Continue with the flow since it's an OR Condition.
@@ -137,14 +133,14 @@ func (m *Matcher) MatchBinary(corpus string) (bool, []string) {
}

// If the condition was an OR, return on the first match.
if m.condition == ORCondition {
return true, []string{string(hexa)}
if matcher.condition == ORCondition {
return true, []string{binary}
}

matchedBinary = append(matchedBinary, string(hexa))
matchedBinary = append(matchedBinary, binary)

// If we are at the end of the words, return with true
if len(m.Binary)-1 == i {
if len(matcher.Binary)-1 == i {
return true, matchedBinary
}
}
@@ -152,9 +148,21 @@ func (m *Matcher) MatchBinary(corpus string) (bool, []string) {
}

// MatchDSL matches on a generic map result
func (m *Matcher) MatchDSL(data map[string]interface{}) bool {
func (matcher *Matcher) MatchDSL(data map[string]interface{}) bool {
// Iterate over all the expressions accepted as valid
for i, expression := range m.dslCompiled {
for i, expression := range matcher.dslCompiled {
if varErr := expressions.ContainsUnresolvedVariables(expression.String()); varErr != nil {
resolvedExpression, err := expressions.Evaluate(expression.String(), data)
if err != nil {
gologger.Warning().Msgf("Could not evaluate expression: %s, error: %s", matcher.Name, err.Error())
return false
}
expression, err = govaluate.NewEvaluableExpressionWithFunctions(resolvedExpression, dsl.HelperFunctions())
if err != nil {
gologger.Warning().Msgf("Could not evaluate expression: %s, error: %s", matcher.Name, err.Error())
return false
}
}
result, err := expression.Evaluate(data)
if err != nil {
continue
@@ -167,7 +175,7 @@ func (m *Matcher) MatchDSL(data map[string]interface{}) bool {
if !ok || !bResult {
// If we are in an AND request and a match failed,
// return false as the AND condition fails on any single mismatch.
if m.condition == ANDCondition {
if matcher.condition == ANDCondition {
return false
}
// Continue with the flow since it's an OR Condition.
@@ -175,12 +183,12 @@ func (m *Matcher) MatchDSL(data map[string]interface{}) bool {
}

// If the condition was an OR, return on the first match.
if m.condition == ORCondition {
if matcher.condition == ORCondition {
return true
}

// If we are at the end of the dsl, return with true
if len(m.dslCompiled)-1 == i {
if len(matcher.dslCompiled)-1 == i {
return true
}
}
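All the `Match*` methods above share the same AND/OR control flow: under AND, any single miss fails the matcher immediately; under OR, the first hit succeeds immediately. A self-contained sketch of that flow for word matching (the `matchWords` helper is illustrative; the real method also evaluates DSL expressions inside each word and handles case-insensitivity):

```go
package main

import (
	"fmt"
	"strings"
)

// matchWords condenses the AND/OR logic of MatchWords above.
func matchWords(corpus string, words []string, and bool) (bool, []string) {
	var matched []string
	for _, word := range words {
		if !strings.Contains(corpus, word) {
			if and { // one miss fails an AND matcher outright
				return false, nil
			}
			continue // OR: keep scanning the remaining words
		}
		if !and { // OR: return on the first hit
			return true, []string{word}
		}
		matched = append(matched, word)
	}
	// AND succeeds only if every word was found
	return len(matched) == len(words), matched
}

func main() {
	ok, hits := matchWords("admin panel login", []string{"admin", "login"}, true)
	fmt.Println(ok, hits) // true [admin login]
}
```

The same skeleton repeats for regexes, binary values, and DSL expressions, which is why the diff's `m` → `matcher` receiver rename touches so many near-identical blocks.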
@@ -3,6 +3,8 @@ package matchers
import (
"testing"

"github.com/Knetic/govaluate"
"github.com/projectdiscovery/nuclei/v2/pkg/operators/common/dsl"
"github.com/stretchr/testify/require"
)

@@ -19,7 +21,7 @@ func TestWordANDCondition(t *testing.T) {
}

func TestRegexANDCondition(t *testing.T) {
m := &Matcher{Type: "regex", Condition: "and", Regex: []string{"[a-z]{3}", "\\d{2}"}}
m := &Matcher{Type: MatcherTypeHolder{MatcherType: RegexMatcher}, Condition: "and", Regex: []string{"[a-z]{3}", "\\d{2}"}}
err := m.CompileMatchers()
require.Nil(t, err)

@@ -49,7 +51,7 @@ func TestORCondition(t *testing.T) {
}

func TestRegexOrCondition(t *testing.T) {
m := &Matcher{Type: "regex", Condition: "or", Regex: []string{"[a-z]{3}", "\\d{2}"}}
m := &Matcher{Type: MatcherTypeHolder{MatcherType: RegexMatcher}, Condition: "or", Regex: []string{"[a-z]{3}", "\\d{2}"}}
err := m.CompileMatchers()
require.Nil(t, err)

@@ -63,7 +65,7 @@ func TestRegexOrCondition(t *testing.T) {
}

func TestHexEncoding(t *testing.T) {
m := &Matcher{Encoding: "hex", Type: "word", Part: "body", Words: []string{"50494e47"}}
m := &Matcher{Encoding: "hex", Type: MatcherTypeHolder{MatcherType: WordsMatcher}, Part: "body", Words: []string{"50494e47"}}
err := m.CompileMatchers()
require.Nil(t, err, "could not compile matcher")

@@ -71,3 +73,19 @@ func TestHexEncoding(t *testing.T) {
require.True(t, isMatched, "Could not match valid Hex condition")
require.Equal(t, m.Words, matched)
}

func TestMatcher_MatchDSL(t *testing.T) {
compiled, err := govaluate.NewEvaluableExpressionWithFunctions("contains(body, \"{{VARIABLE}}\")", dsl.HelperFunctions())
require.Nil(t, err, "couldn't compile expression")

m := &Matcher{Type: MatcherTypeHolder{MatcherType: DSLMatcher}, dslCompiled: []*govaluate.EvaluableExpression{compiled}}
err = m.CompileMatchers()
require.Nil(t, err, "could not compile matcher")

values := []string{"PING", "pong"}

for value := range values {
isMatched := m.MatchDSL(map[string]interface{}{"body": value, "VARIABLE": value})
require.True(t, isMatched)
}
}
@@ -10,14 +10,7 @@ import (
type Matcher struct {
// description: |
//   Type is the type of the matcher.
// values:
//   - "status"
//   - "size"
//   - "word"
//   - "regex"
//   - "binary"
//   - "dsl"
Type string `yaml:"type" jsonschema:"title=type of matcher,description=Type of the matcher,enum=status,enum=size,enum=word,enum=regex,enum=binary,enum=dsl"`
Type MatcherTypeHolder `yaml:"type" jsonschema:"title=type of matcher,description=Type of the matcher,enum=status,enum=size,enum=word,enum=regex,enum=binary,enum=dsl"`
// description: |
//   Condition is the optional condition between two matcher variables. By default,
//   the condition is assumed to be OR.
@@ -62,7 +55,7 @@ type Matcher struct {
// description: |
//   Words contains word patterns required to be present in the response part.
// examples:
//   - name: Match for outlook mail protection domain
//   - name: Match for Outlook mail protection domain
//     value: >
//       []string{"mail.protection.outlook.com"}
//   - name: Match for application/json in response headers
@@ -105,42 +98,21 @@ type Matcher struct {
// values:
//   - "hex"
Encoding string `yaml:"encoding,omitempty" jsonschema:"title=encoding for word field,description=Optional encoding for the word fields,enum=hex"`
// description: |
//   CaseInsensitive enables case-insensitive matches. Default is false.
// values:
//   - false
//   - true
CaseInsensitive bool `yaml:"case-insensitive,omitempty" jsonschema:"title=use case insensitive match,description=use case insensitive match"`

// cached data for the compiled matcher
condition ConditionType
matcherType MatcherType
binaryDecoded []string
regexCompiled []*regexp.Regexp
dslCompiled []*govaluate.EvaluableExpression
}

// MatcherType is the type of the matcher specified
type MatcherType = int

const (
// WordsMatcher matches responses with words
WordsMatcher MatcherType = iota + 1
// RegexMatcher matches responses with regexes
RegexMatcher
// BinaryMatcher matches responses with words
BinaryMatcher
// StatusMatcher matches responses with status codes
StatusMatcher
// SizeMatcher matches responses with response size
SizeMatcher
// DSLMatcher matches based upon dsl syntax
DSLMatcher
)

// MatcherTypes is a table for conversion of matcher type from string.
var MatcherTypes = map[string]MatcherType{
"status": StatusMatcher,
"size": SizeMatcher,
"word": WordsMatcher,
"regex": RegexMatcher,
"binary": BinaryMatcher,
"dsl": DSLMatcher,
}

// ConditionType is the type of condition for matcher
type ConditionType int

@@ -158,22 +130,17 @@ var ConditionTypes = map[string]ConditionType{
}

// Result reverts the results of the match if the matcher is of type negative.
func (m *Matcher) Result(data bool) bool {
if m.Negative {
func (matcher *Matcher) Result(data bool) bool {
if matcher.Negative {
return !data
}
return data
}

// ResultWithMatchedSnippet returns true and the matched snippet, or false and an empty string
func (m *Matcher) ResultWithMatchedSnippet(data bool, matchedSnippet []string) (bool, []string) {
if m.Negative {
func (matcher *Matcher) ResultWithMatchedSnippet(data bool, matchedSnippet []string) (bool, []string) {
if matcher.Negative {
return !data, []string{}
}
return data, matchedSnippet
}

// GetType returns the type of the matcher
func (m *Matcher) GetType() MatcherType {
return m.matcherType
}
115
v2/pkg/operators/matchers/matchers_types.go
Normal file
115
v2/pkg/operators/matchers/matchers_types.go
Normal file
@@ -0,0 +1,115 @@
package matchers

import (
	"encoding/json"
	"errors"
	"strings"

	"github.com/alecthomas/jsonschema"
)

// MatcherType is the type of the matcher specified
type MatcherType int

// name:MatcherType
const (
	// name:word
	WordsMatcher MatcherType = iota + 1
	// name:regex
	RegexMatcher
	// name:binary
	BinaryMatcher
	// name:status
	StatusMatcher
	// name:size
	SizeMatcher
	// name:dsl
	DSLMatcher
	limit
)

// MatcherTypes is a table for conversion of matcher type from string.
var MatcherTypes = map[MatcherType]string{
	StatusMatcher: "status",
	SizeMatcher:   "size",
	WordsMatcher:  "word",
	RegexMatcher:  "regex",
	BinaryMatcher: "binary",
	DSLMatcher:    "dsl",
}

// GetType returns the type of the matcher
func (matcher *Matcher) GetType() MatcherType {
	return matcher.Type.MatcherType
}

// GetSupportedMatcherTypes returns list of supported types
func GetSupportedMatcherTypes() []MatcherType {
	var result []MatcherType
	for index := MatcherType(1); index < limit; index++ {
		result = append(result, index)
	}
	return result
}

func toMatcherTypes(valueToMap string) (MatcherType, error) {
	normalizedValue := normalizeValue(valueToMap)
	for key, currentValue := range MatcherTypes {
		if normalizedValue == currentValue {
			return key, nil
		}
	}
	return -1, errors.New("Invalid matcher type: " + valueToMap)
}

func normalizeValue(value string) string {
	return strings.TrimSpace(strings.ToLower(value))
}

func (t MatcherType) String() string {
	return MatcherTypes[t]
}

// MatcherTypeHolder is used to hold internal type of the matcher
type MatcherTypeHolder struct {
	MatcherType MatcherType `mapping:"true"`
}

func (t MatcherTypeHolder) String() string {
	return t.MatcherType.String()
}

func (holder MatcherTypeHolder) JSONSchemaType() *jsonschema.Type {
	gotType := &jsonschema.Type{
		Type:        "string",
		Title:       "type of the matcher",
		Description: "Type of the matcher,enum=status,enum=size,enum=word,enum=regex,enum=binary,enum=dsl",
	}
	for _, types := range GetSupportedMatcherTypes() {
		gotType.Enum = append(gotType.Enum, types.String())
	}
	return gotType
}

func (holder *MatcherTypeHolder) UnmarshalYAML(unmarshal func(interface{}) error) error {
	var marshalledTypes string
	if err := unmarshal(&marshalledTypes); err != nil {
		return err
	}

	computedType, err := toMatcherTypes(marshalledTypes)
	if err != nil {
		return err
	}

	holder.MatcherType = computedType
	return nil
}

func (holder MatcherTypeHolder) MarshalJSON() ([]byte, error) {
	return json.Marshal(holder.MatcherType.String())
}

func (holder MatcherTypeHolder) MarshalYAML() (interface{}, error) {
	return holder.MatcherType.String(), nil
}
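The new matchers_types.go follows a common Go enum pattern: iota-based constants, a private `limit` sentinel that marks one past the last valid value, and a lookup map for string conversion with normalized reverse lookup. A minimal, self-contained sketch of the same pattern (names simplified, outside nuclei):

```go
package main

import (
	"fmt"
	"strings"
)

// MatcherType mirrors the integer enum style used in matchers_types.go.
type MatcherType int

const (
	WordsMatcher MatcherType = iota + 1
	RegexMatcher
	BinaryMatcher
	StatusMatcher
	SizeMatcher
	DSLMatcher
	limit // sentinel: one past the last valid value
)

// matcherTypeNames maps each enum value to its template-facing name.
var matcherTypeNames = map[MatcherType]string{
	WordsMatcher:  "word",
	RegexMatcher:  "regex",
	BinaryMatcher: "binary",
	StatusMatcher: "status",
	SizeMatcher:   "size",
	DSLMatcher:    "dsl",
}

func (t MatcherType) String() string { return matcherTypeNames[t] }

// supportedTypes enumerates all valid values using the limit sentinel,
// like GetSupportedMatcherTypes does in the diff above.
func supportedTypes() []MatcherType {
	var result []MatcherType
	for t := MatcherType(1); t < limit; t++ {
		result = append(result, t)
	}
	return result
}

// toMatcherType performs the normalized reverse lookup used during
// YAML unmarshalling: trim whitespace, lowercase, then scan the map.
func toMatcherType(value string) (MatcherType, error) {
	normalized := strings.TrimSpace(strings.ToLower(value))
	for k, v := range matcherTypeNames {
		if v == normalized {
			return k, nil
		}
	}
	return -1, fmt.Errorf("invalid matcher type: %s", value)
}

func main() {
	for _, t := range supportedTypes() {
		fmt.Print(t, " ") // word regex binary status size dsl
	}
	fmt.Println()

	got, err := toMatcherType("  DSL ") // normalization makes this succeed
	fmt.Println(got, err)
}
```

The sentinel keeps `GetSupportedMatcherTypes` and the JSON-schema enum automatically in sync with the constant block whenever a new matcher type is appended.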
@@ -72,11 +72,64 @@ type Result struct {
 	// OutputExtracts is the list of extracts to be displayed on screen.
 	OutputExtracts []string
 	// DynamicValues contains any dynamic values to be templated
-	DynamicValues map[string]interface{}
+	DynamicValues map[string][]string
 	// PayloadValues contains payload values provided by user. (Optional)
 	PayloadValues map[string]interface{}
 }

+// MakeDynamicValuesCallback takes an input dynamic values map and calls
+// the callback function with all variations of the data in input in the form
+// of map[string]interface{}.
+func MakeDynamicValuesCallback(input map[string][]string, iterateAllValues bool, callback func(map[string]interface{}) bool) {
+	output := make(map[string]interface{}, len(input))
+
+	if !iterateAllValues {
+		for k, v := range input {
+			if len(v) > 0 {
+				output[k] = v[0]
+			}
+		}
+		callback(output)
+		return
+	}
+	inputIndex := make(map[string]int, len(input))
+
+	var maxValue int
+	for _, v := range input {
+		if len(v) > maxValue {
+			maxValue = len(v)
+		}
+	}
+
+	for i := 0; i < maxValue; i++ {
+		for k, v := range input {
+			if len(v) == 0 {
+				continue
+			}
+			if len(v) == 1 {
+				output[k] = v[0]
+				continue
+			}
+			if gotIndex, ok := inputIndex[k]; !ok {
+				inputIndex[k] = 0
+				output[k] = v[0]
+			} else {
+				newIndex := gotIndex + 1
+				if newIndex >= len(v) {
+					output[k] = v[len(v)-1]
+					continue
+				}
+				output[k] = v[newIndex]
+				inputIndex[k] = newIndex
+			}
+		}
+		// stop iterating early if the callback returns true
+		if callback(output) {
+			return
+		}
+	}
+}

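The iteration logic of `MakeDynamicValuesCallback` can be exercised in isolation. This sketch reproduces the function as shown in the hunk above and drives it with a small input; the expected invocation count follows the accompanying operators_test.go:

```go
package main

import "fmt"

// MakeDynamicValuesCallback is reproduced from the hunk above: it either
// emits one map built from the first value of each key, or iterates until
// the longest value slice is exhausted, advancing each key's index in
// lockstep and clamping shorter slices to their last value.
func MakeDynamicValuesCallback(input map[string][]string, iterateAllValues bool, callback func(map[string]interface{}) bool) {
	output := make(map[string]interface{}, len(input))

	if !iterateAllValues {
		for k, v := range input {
			if len(v) > 0 {
				output[k] = v[0]
			}
		}
		callback(output)
		return
	}
	inputIndex := make(map[string]int, len(input))

	var maxValue int
	for _, v := range input {
		if len(v) > maxValue {
			maxValue = len(v)
		}
	}

	for i := 0; i < maxValue; i++ {
		for k, v := range input {
			if len(v) == 0 {
				continue
			}
			if len(v) == 1 {
				output[k] = v[0]
				continue
			}
			if gotIndex, ok := inputIndex[k]; !ok {
				inputIndex[k] = 0
				output[k] = v[0]
			} else {
				newIndex := gotIndex + 1
				if newIndex >= len(v) {
					output[k] = v[len(v)-1]
					continue
				}
				output[k] = v[newIndex]
				inputIndex[k] = newIndex
			}
		}
		if callback(output) { // a true return stops the iteration early
			return
		}
	}
}

func main() {
	input := map[string][]string{"a": {"1", "2"}, "d": {"A", "B", "C"}}
	count := 0
	MakeDynamicValuesCallback(input, true, func(m map[string]interface{}) bool {
		count++
		fmt.Println(m["a"], m["d"])
		return false
	})
	fmt.Println("invocations:", count) // invocations: 3 (length of the longest slice)
}
```

The driver yields `1 A`, `2 B`, `2 C`: once `a` runs out of values it is clamped to its last element while `d` keeps advancing.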
 // Merge merges a result structure into the other.
 func (r *Result) Merge(result *Result) {
 	if !r.Matched && result.Matched {
@@ -115,7 +168,7 @@ func (operators *Operators) Execute(data map[string]interface{}, match MatchFunc
 	result := &Result{
 		Matches:  make(map[string][]string),
 		Extracts: make(map[string][]string),
-		DynamicValues: make(map[string]interface{}),
+		DynamicValues: make(map[string][]string),
 	}

 	// Start with the extractors first and evaluate them.
@@ -126,8 +179,10 @@ func (operators *Operators) Execute(data map[string]interface{}, match MatchFunc
 			extractorResults = append(extractorResults, match)

 			if extractor.Internal {
-				if _, ok := result.DynamicValues[extractor.Name]; !ok {
-					result.DynamicValues[extractor.Name] = match
+				if data, ok := result.DynamicValues[extractor.Name]; !ok {
+					result.DynamicValues[extractor.Name] = []string{match}
+				} else {
+					result.DynamicValues[extractor.Name] = append(data, match)
 				}
 			} else {
 				result.OutputExtracts = append(result.OutputExtracts, match)
@@ -179,7 +234,7 @@ func getMatcherName(matcher *matchers.Matcher, matcherIndex int) string {
 	if matcher.Name != "" {
 		return matcher.Name
 	} else {
-		return matcher.Type + "-" + strconv.Itoa(matcherIndex+1) // making the index start from 1 to be more readable
+		return matcher.Type.String() + "-" + strconv.Itoa(matcherIndex+1) // making the index start from 1 to be more readable
 	}
 }

57  v2/pkg/operators/operators_test.go  Normal file
@@ -0,0 +1,57 @@
package operators

import (
	"testing"

	"github.com/stretchr/testify/require"
)

func TestMakeDynamicValuesCallback(t *testing.T) {
	input := map[string][]string{
		"a": []string{"1", "2"},
		"b": []string{"3"},
		"c": []string{},
		"d": []string{"A", "B", "C"},
	}

	count := 0
	MakeDynamicValuesCallback(input, true, func(data map[string]interface{}) bool {
		count++
		require.Len(t, data, 3, "could not get correct output length")
		return false
	})
	require.Equal(t, 3, count, "could not get correct result count")

	t.Run("all", func(t *testing.T) {
		input := map[string][]string{
			"a": []string{"1"},
			"b": []string{"2"},
			"c": []string{"3"},
		}

		count := 0
		MakeDynamicValuesCallback(input, true, func(data map[string]interface{}) bool {
			count++
			require.Len(t, data, 3, "could not get correct output length")
			return false
		})
		require.Equal(t, 1, count, "could not get correct result count")
	})

	t.Run("first", func(t *testing.T) {
		input := map[string][]string{
			"a": []string{"1", "2"},
			"b": []string{"3"},
			"c": []string{},
			"d": []string{"A", "B", "C"},
		}

		count := 0
		MakeDynamicValuesCallback(input, false, func(data map[string]interface{}) bool {
			count++
			require.Len(t, data, 3, "could not get correct output length")
			return false
		})
		require.Equal(t, 1, count, "could not get correct result count")
	})
}
@@ -2,11 +2,13 @@ package output

 import (
 	"os"
+	"sync"
 )

 // fileWriter is a concurrent file based output writer.
 type fileWriter struct {
 	file *os.File
+	mu   sync.Mutex
 }

 // NewFileOutputWriter creates a new buffered writer for a file
@@ -19,16 +21,22 @@ func newFileOutputWriter(file string) (*fileWriter, error) {
 }

 // WriteString writes an output to the underlying file
-func (w *fileWriter) Write(data []byte) error {
+func (w *fileWriter) Write(data []byte) (int, error) {
 	w.mu.Lock()
 	defer w.mu.Unlock()
 	if _, err := w.file.Write(data); err != nil {
-		return err
+		return 0, err
 	}
-	_, err := w.file.Write([]byte("\n"))
-	return err
+	if _, err := w.file.Write([]byte("\n")); err != nil {
+		return 0, err
+	}
+	return len(data) + 1, nil
 }
 // Close closes the underlying writer flushing everything to disk
 func (w *fileWriter) Close() error {
 	w.mu.Lock()
 	defer w.mu.Unlock()
 	//nolint:errcheck // we don't care whether sync failed or succeeded.
 	w.file.Sync()
 	return w.file.Close()
@@ -27,6 +27,15 @@ func (w *StandardWriter) formatScreen(output *ResultEvent) []byte {
 		builder.WriteString(w.aurora.BrightGreen(output.ExtractorName).Bold().String())
 	}

+	if w.matcherStatus {
+		builder.WriteString("] [")
+		if !output.MatcherStatus {
+			builder.WriteString(w.aurora.Red("failed").String())
+		} else {
+			builder.WriteString(w.aurora.Green("matched").String())
+		}
+	}

 	builder.WriteString("] [")
 	builder.WriteString(w.aurora.BrightBlue(output.Type).String())
 	builder.WriteString("] ")
@@ -35,7 +44,11 @@ func (w *StandardWriter) formatScreen(output *ResultEvent) []byte {
 		builder.WriteString(w.severityColors(output.Info.SeverityHolder.Severity))
 		builder.WriteString("] ")
 	}
+	if output.Matched != "" {
+		builder.WriteString(output.Matched)
+	} else {
+		builder.WriteString(output.Host)
+	}

 	// If any extractors, write the results
 	if len(output.ExtractedResults) > 0 {
@@ -1,9 +1,9 @@
 package output

 import (
+	"io"
 	"os"
 	"regexp"
-	"sync"
 	"time"

 	"github.com/pkg/errors"
@@ -16,6 +16,8 @@ import (
 	"github.com/projectdiscovery/nuclei/v2/pkg/model"
 	"github.com/projectdiscovery/nuclei/v2/pkg/model/types/severity"
 	"github.com/projectdiscovery/nuclei/v2/pkg/operators"
+	"github.com/projectdiscovery/nuclei/v2/pkg/types"
+	"github.com/projectdiscovery/nuclei/v2/pkg/utils"
 )

 // Writer is an interface which writes output to somewhere for nuclei events.
@@ -26,6 +28,8 @@ type Writer interface {
 	Colorizer() aurora.Aurora
 	// Write writes the event to file and/or screen.
 	Write(*ResultEvent) error
+	// WriteFailure writes the optional failure event for template to file and/or screen.
+	WriteFailure(event InternalEvent) error
 	// Request logs a request in the trace log
 	Request(templateID, url, requestType string, err error)
 }
@@ -36,11 +40,11 @@ type StandardWriter struct {
 	jsonReqResp bool
 	noTimestamp bool
 	noMetadata  bool
+	matcherStatus bool
 	aurora      aurora.Aurora
-	outputFile  *fileWriter
-	outputMutex *sync.Mutex
-	traceFile   *fileWriter
-	traceMutex  *sync.Mutex
+	outputFile  io.WriteCloser
+	traceFile   io.WriteCloser
+	errorFile   io.WriteCloser
 	severityColors func(severity.Severity) string
 }
@@ -54,10 +58,16 @@ type InternalWrappedEvent struct {
 	InternalEvent   InternalEvent
 	Results         []*ResultEvent
 	OperatorsResult *operators.Result
+	UsesInteractsh  bool
 }

 // ResultEvent is a wrapped result event for a single nuclei output.
 type ResultEvent struct {
 	// Template is the relative filename for the template
 	Template string `json:"template,omitempty"`
+	// TemplateURL is the URL of the template for the result inside the nuclei
+	// templates repository if it belongs to the repository.
+	TemplateURL string `json:"template-url,omitempty"`
 	// TemplateID is the ID of the template for the result.
 	TemplateID string `json:"template-id"`
 	// TemplatePath is the path of template
@@ -93,14 +103,16 @@ type ResultEvent struct {
 	// CURLCommand is an optional curl command to reproduce the request
 	// Only applicable if the report is for HTTP.
 	CURLCommand string `json:"curl-command,omitempty"`
+	// MatcherStatus is the status of the match
+	MatcherStatus bool `json:"matcher-status"`

 	FileToIndexPosition map[string]int `json:"-"`
 }

 // NewStandardWriter creates a new output writer based on user configurations
-func NewStandardWriter(colors, noMetadata, noTimestamp, json, jsonReqResp bool, file, traceFile string) (*StandardWriter, error) {
+func NewStandardWriter(colors, noMetadata, noTimestamp, json, jsonReqResp, MatcherStatus bool, file, traceFile string, errorFile string) (*StandardWriter, error) {
 	auroraColorizer := aurora.NewAurora(colors)

-	var outputFile *fileWriter
+	var outputFile io.WriteCloser
 	if file != "" {
 		output, err := newFileOutputWriter(file)
 		if err != nil {
@@ -108,7 +120,7 @@ func NewStandardWriter(colors, noMetadata, noTimestamp, json, jsonReqResp bool,
 		}
 		outputFile = output
 	}
-	var traceOutput *fileWriter
+	var traceOutput io.WriteCloser
 	if traceFile != "" {
 		output, err := newFileOutputWriter(traceFile)
 		if err != nil {
@@ -116,16 +128,24 @@ func NewStandardWriter(colors, noMetadata, noTimestamp, json, jsonReqResp bool,
 		}
 		traceOutput = output
 	}
+	var errorOutput io.WriteCloser
+	if errorFile != "" {
+		output, err := newFileOutputWriter(errorFile)
+		if err != nil {
+			return nil, errors.Wrap(err, "could not create error file")
+		}
+		errorOutput = output
+	}
 	writer := &StandardWriter{
 		json:        json,
 		jsonReqResp: jsonReqResp,
 		noMetadata:  noMetadata,
+		matcherStatus: MatcherStatus,
 		noTimestamp: noTimestamp,
 		aurora:      auroraColorizer,
 		outputFile:  outputFile,
-		outputMutex: &sync.Mutex{},
 		traceFile:   traceOutput,
-		traceMutex:  &sync.Mutex{},
+		errorFile:   errorOutput,
 		severityColors: colorizer.New(auroraColorizer),
 	}
 	return writer, nil
@@ -133,6 +153,10 @@ func NewStandardWriter(colors, noMetadata, noTimestamp, json, jsonReqResp bool,

 // Write writes the event to file and/or screen.
 func (w *StandardWriter) Write(event *ResultEvent) error {
+	// Enrich the result event with extra metadata on the template-path and url.
+	if event.TemplatePath != "" {
+		event.Template, event.TemplateURL = utils.TemplatePathURL(types.ToString(event.TemplatePath))
+	}
 	event.Timestamp = time.Now()

 	var data []byte
@@ -155,33 +179,33 @@ func (w *StandardWriter) Write(event *ResultEvent) error {
 	if !w.json {
 		data = decolorizerRegex.ReplaceAll(data, []byte(""))
 	}
-	if writeErr := w.outputFile.Write(data); writeErr != nil {
+	if _, writeErr := w.outputFile.Write(data); writeErr != nil {
 		return errors.Wrap(err, "could not write to output")
 	}
 	}
 	return nil
 }

-// JSONTraceRequest is a trace log request written to file
-type JSONTraceRequest struct {
-	ID  string `json:"id"`
-	URL string `json:"url"`
+// JSONLogRequest is a trace/error log request written to file
+type JSONLogRequest struct {
+	Template string `json:"template"`
+	Input    string `json:"input"`
 	Error string `json:"error"`
 	Type  string `json:"type"`
 }

 // Request writes a request with its status to the trace and/or error logs
-func (w *StandardWriter) Request(templateID, url, requestType string, err error) {
-	if w.traceFile == nil {
+func (w *StandardWriter) Request(templatePath, input, requestType string, requestErr error) {
+	if w.traceFile == nil && w.errorFile == nil {
 		return
 	}
-	request := &JSONTraceRequest{
-		ID:  templateID,
-		URL: url,
+	request := &JSONLogRequest{
+		Template: templatePath,
+		Input:    input,
 		Type: requestType,
 	}
-	if err != nil {
-		request.Error = err.Error()
+	if unwrappedErr := utils.UnwrapError(requestErr); unwrappedErr != nil {
+		request.Error = unwrappedErr.Error()
 	} else {
 		request.Error = "none"
 	}
@@ -190,9 +214,14 @@ func (w *StandardWriter) Request(templateID, url, requestType string, err error)
 	if err != nil {
 		return
 	}
-	w.traceMutex.Lock()
-	_ = w.traceFile.Write(data)
-	w.traceMutex.Unlock()
+
+	if w.traceFile != nil {
+		_, _ = w.traceFile.Write(data)
+	}
+
+	if requestErr != nil && w.errorFile != nil {
+		_, _ = w.errorFile.Write(data)
+	}
 }
 // Colorizer returns the colorizer instance for writer
@@ -208,4 +237,27 @@ func (w *StandardWriter) Close() {
 	if w.traceFile != nil {
 		w.traceFile.Close()
 	}
+	if w.errorFile != nil {
+		w.errorFile.Close()
+	}
 }

+// WriteFailure writes the failure event for template to file and/or screen.
+func (w *StandardWriter) WriteFailure(event InternalEvent) error {
+	if !w.matcherStatus {
+		return nil
+	}
+	templatePath, templateURL := utils.TemplatePathURL(types.ToString(event["template-path"]))
+	data := &ResultEvent{
+		Template:      templatePath,
+		TemplateURL:   templateURL,
+		TemplateID:    types.ToString(event["template-id"]),
+		TemplatePath:  types.ToString(event["template-path"]),
+		Info:          event["template-info"].(model.Info),
+		Type:          types.ToString(event["type"]),
+		Host:          types.ToString(event["host"]),
+		MatcherStatus: false,
+		Timestamp:     time.Now(),
+	}
+	return w.Write(data)
+}

59  v2/pkg/output/output_test.go  Normal file
@@ -0,0 +1,59 @@
package output

import (
	"fmt"
	"strings"
	"testing"

	"github.com/pkg/errors"
	"github.com/stretchr/testify/require"
)

func TestStandardWriterRequest(t *testing.T) {
	t.Run("WithoutTraceAndError", func(t *testing.T) {
		w, err := NewStandardWriter(false, false, false, false, false, false, "", "", "")
		require.NoError(t, err)
		require.NotPanics(t, func() {
			w.Request("path", "input", "http", nil)
			w.Close()
		})
	})

	t.Run("TraceAndErrorWithoutError", func(t *testing.T) {
		traceWriter := &testWriteCloser{}
		errorWriter := &testWriteCloser{}

		w, err := NewStandardWriter(false, false, false, false, false, false, "", "", "")
		w.traceFile = traceWriter
		w.errorFile = errorWriter
		require.NoError(t, err)
		w.Request("path", "input", "http", nil)

		require.Equal(t, `{"template":"path","input":"input","error":"none","type":"http"}`, traceWriter.String())
		require.Empty(t, errorWriter.String())
	})

	t.Run("ErrorWithWrappedError", func(t *testing.T) {
		errorWriter := &testWriteCloser{}

		w, err := NewStandardWriter(false, false, false, false, false, false, "", "", "")
		w.errorFile = errorWriter
		require.NoError(t, err)
		w.Request(
			"misconfiguration/tcpconfig.yaml",
			"https://example.com/tcpconfig.html",
			"http",
			fmt.Errorf("GET https://example.com/tcpconfig.html/tcpconfig.html giving up after 2 attempts: %w", errors.New("context deadline exceeded (Client.Timeout exceeded while awaiting headers)")),
		)

		require.Equal(t, `{"template":"misconfiguration/tcpconfig.yaml","input":"https://example.com/tcpconfig.html","error":"context deadline exceeded (Client.Timeout exceeded while awaiting headers)","type":"http"}`, errorWriter.String())
	})
}

type testWriteCloser struct {
	strings.Builder
}

func (w testWriteCloser) Close() error {
	return nil
}
@@ -5,6 +5,7 @@ import (
 	"io/ioutil"
 	"os"
+	"regexp"
 	"strings"

 	"gopkg.in/yaml.v2"
@@ -13,11 +14,15 @@ import (
 	"github.com/projectdiscovery/nuclei/v2/pkg/model"
 	"github.com/projectdiscovery/nuclei/v2/pkg/templates"
 	"github.com/projectdiscovery/nuclei/v2/pkg/templates/cache"
+	"github.com/projectdiscovery/nuclei/v2/pkg/templates/types"
 	"github.com/projectdiscovery/nuclei/v2/pkg/utils"
 	"github.com/projectdiscovery/nuclei/v2/pkg/utils/stats"
 )

-const mandatoryFieldMissingTemplate = "mandatory '%s' field is missing"
+const (
+	mandatoryFieldMissingTemplate = "mandatory '%s' field is missing"
+	invalidFieldFormatTemplate    = "invalid field format for '%s' (allowed format is %s)"
+)

 // LoadTemplate returns true if the template is valid and matches the filtering criteria.
 func LoadTemplate(templatePath string, tagFilter *filter.TagFilter, extraTags []string) (bool, error) {
@@ -30,12 +35,12 @@ func LoadTemplate(templatePath string, tagFilter *filter.TagFilter, extraTags []
 		return false, nil
 	}

-	templateInfo := template.Info
-	if validationError := validateMandatoryInfoFields(&templateInfo); validationError != nil {
+	if validationError := validateTemplateFields(template); validationError != nil {
 		stats.Increment(SyntaxErrorStats)
 		return false, validationError
 	}

-	return isTemplateInfoMetadataMatch(tagFilter, &templateInfo, extraTags)
+	return isTemplateInfoMetadataMatch(tagFilter, &template.Info, extraTags, template.Type())
 }

 // LoadWorkflow returns true if the workflow is valid and matches the filtering criteria.
@@ -45,10 +50,8 @@ func LoadWorkflow(templatePath string) (bool, error) {
 		return false, templateParseError
 	}

-	templateInfo := template.Info
-
 	if len(template.Workflows) > 0 {
-		if validationError := validateMandatoryInfoFields(&templateInfo); validationError != nil {
+		if validationError := validateTemplateFields(template); validationError != nil {
 			return false, validationError
 		}
 		return true, nil
@@ -57,12 +60,12 @@ func LoadWorkflow(templatePath string) (bool, error) {
 	return false, nil
 }

-func isTemplateInfoMetadataMatch(tagFilter *filter.TagFilter, templateInfo *model.Info, extraTags []string) (bool, error) {
+func isTemplateInfoMetadataMatch(tagFilter *filter.TagFilter, templateInfo *model.Info, extraTags []string, templateType types.ProtocolType) (bool, error) {
 	templateTags := templateInfo.Tags.ToSlice()
 	templateAuthors := templateInfo.Authors.ToSlice()
 	templateSeverity := templateInfo.SeverityHolder.Severity

-	match, err := tagFilter.Match(templateTags, templateAuthors, templateSeverity, extraTags)
+	match, err := tagFilter.Match(templateTags, templateAuthors, templateSeverity, extraTags, templateType)

 	if err == filter.ErrExcluded {
 		return false, filter.ErrExcluded
@@ -71,18 +74,29 @@ func isTemplateInfoMetadataMatch(tagFilter *filter.TagFilter, templateInfo *mode
 	return match, err
 }

-func validateMandatoryInfoFields(info *model.Info) error {
-	if info == nil {
-		return fmt.Errorf(mandatoryFieldMissingTemplate, "info")
-	}
+func validateTemplateFields(template *templates.Template) error {
+	info := template.Info

+	var errors []string

 	if utils.IsBlank(info.Name) {
-		return fmt.Errorf(mandatoryFieldMissingTemplate, "name")
+		errors = append(errors, fmt.Sprintf(mandatoryFieldMissingTemplate, "name"))
 	}

 	if info.Authors.IsEmpty() {
-		return fmt.Errorf(mandatoryFieldMissingTemplate, "author")
+		errors = append(errors, fmt.Sprintf(mandatoryFieldMissingTemplate, "author"))
 	}

+	if template.ID == "" {
+		errors = append(errors, fmt.Sprintf(mandatoryFieldMissingTemplate, "id"))
+	} else if !templateIDRegexp.MatchString(template.ID) {
+		errors = append(errors, fmt.Sprintf(invalidFieldFormatTemplate, "id", templateIDRegexp.String()))
+	}

+	if len(errors) > 0 {
+		return fmt.Errorf(strings.Join(errors, ", "))
+	}

+	return nil
 }

@@ -90,11 +104,13 @@ var (
 	parsedTemplatesCache *cache.Templates
 	ShouldValidate       bool
+	fieldErrorRegexp     = regexp.MustCompile(`not found in`)
+	templateIDRegexp     = regexp.MustCompile(`^([a-zA-Z0-9]+[-_])*[a-zA-Z0-9]+$`)
 )
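The new `templateIDRegexp` constrains template IDs to alphanumeric runs joined by single `-` or `_` separators, with no leading or trailing separator and no whitespace. Checking a few IDs from the parser_test.go table against the same pattern:

```go
package main

import (
	"fmt"
	"regexp"
)

// templateIDRegexp is the ID format introduced in the parsers hunk above:
// one or more alphanumeric groups, each (except the last) followed by a
// single '-' or '_', anchored at both ends.
var templateIDRegexp = regexp.MustCompile(`^([a-zA-Z0-9]+[-_])*[a-zA-Z0-9]+$`)

func main() {
	ids := []string{
		"CVE-2021-27330",   // true: alphanumeric groups with single dashes
		"CVE_2021_27330",   // true: underscores are allowed too
		"ABC DEF",          // false: whitespace is not a valid separator
		"CVE-2021--27330",  // false: a double dash implies an empty group
		"-CVE-2021-27330-", // false: leading/trailing separators
	}
	for _, id := range ids {
		fmt.Printf("%-18s %v\n", id, templateIDRegexp.MatchString(id))
	}
}
```

The anchored `^...$` form is important: without it `regexp.MatchString` would accept any string merely containing a valid ID substring.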
 const (
 	SyntaxWarningStats   = "syntax-warnings"
 	SyntaxErrorStats     = "syntax-errors"
+	RuntimeWarningsStats = "runtime-warnings"
 )

 func init() {
@@ -103,6 +119,7 @@ func init() {

 	stats.NewEntry(SyntaxWarningStats, "Found %d templates with syntax warning (use -validate flag for further examination)")
 	stats.NewEntry(SyntaxErrorStats, "Found %d templates with syntax error (use -validate flag for further examination)")
+	stats.NewEntry(RuntimeWarningsStats, "Found %d templates with runtime error (use -validate flag for further examination)")
 }

 // ParseTemplate parses a template and returns a *templates.Template structure

111  v2/pkg/parsers/parser_test.go  Normal file
@@ -0,0 +1,111 @@
package parsers

import (
	"errors"
	"fmt"
	"testing"

	"github.com/stretchr/testify/require"

	"github.com/projectdiscovery/nuclei/v2/pkg/catalog/loader/filter"
	"github.com/projectdiscovery/nuclei/v2/pkg/model"
	"github.com/projectdiscovery/nuclei/v2/pkg/model/types/stringslice"
	"github.com/projectdiscovery/nuclei/v2/pkg/templates"
)

func TestLoadTemplate(t *testing.T) {
	origTemplatesCache := parsedTemplatesCache
	defer func() { parsedTemplatesCache = origTemplatesCache }()

	tt := []struct {
		name        string
		template    *templates.Template
		templateErr error

		expectedErr error
	}{
		{
			name: "valid",
			template: &templates.Template{
				ID: "CVE-2021-27330",
				Info: model.Info{
					Name:    "Valid template",
					Authors: stringslice.StringSlice{Value: "Author"},
				},
			},
		},
		{
			name:        "emptyTemplate",
			template:    &templates.Template{},
			expectedErr: errors.New("mandatory 'name' field is missing, mandatory 'author' field is missing, mandatory 'id' field is missing"),
		},
		{
			name: "emptyNameWithInvalidID",
			template: &templates.Template{
				ID: "invalid id",
				Info: model.Info{
					Authors: stringslice.StringSlice{Value: "Author"},
				},
			},
			expectedErr: errors.New("mandatory 'name' field is missing, invalid field format for 'id' (allowed format is ^([a-zA-Z0-9]+[-_])*[a-zA-Z0-9]+$)"),
		},
	}

	for _, tc := range tt {
		t.Run(tc.name, func(t *testing.T) {
			parsedTemplatesCache.Store(tc.name, tc.template, tc.templateErr)

			tagFilter := filter.New(&filter.Config{})
			success, err := LoadTemplate(tc.name, tagFilter, nil)
			if tc.expectedErr == nil {
				require.NoError(t, err)
				require.True(t, success)
			} else {
				require.Equal(t, tc.expectedErr, err)
				require.False(t, success)
			}
		})
	}

	t.Run("invalidTemplateID", func(t *testing.T) {
		tt := []struct {
			id      string
			success bool
		}{
			{id: "A-B-C", success: true},
			{id: "A-B-C-1", success: true},
			{id: "CVE_2021_27330", success: true},
			{id: "ABC DEF", success: false},
			{id: "_-__AAA_", success: false},
			{id: " CVE-2021-27330", success: false},
			{id: "CVE-2021-27330 ", success: false},
			{id: "CVE-2021-27330-", success: false},
			{id: "-CVE-2021-27330-", success: false},
			{id: "CVE-2021--27330", success: false},
			{id: "CVE-2021+27330", success: false},
		}
		for i, tc := range tt {
			name := fmt.Sprintf("regexp%d", i)
			t.Run(name, func(t *testing.T) {
				template := &templates.Template{
					ID: tc.id,
					Info: model.Info{
						Name:    "Valid template",
						Authors: stringslice.StringSlice{Value: "Author"},
					},
				}
				parsedTemplatesCache.Store(name, template, nil)

				tagFilter := filter.New(&filter.Config{})
				success, err := LoadTemplate(name, tagFilter, nil)
				if tc.success {
					require.NoError(t, err)
					require.True(t, success)
				} else {
					require.Equal(t, errors.New("invalid field format for 'id' (allowed format is ^([a-zA-Z0-9]+[-_])*[a-zA-Z0-9]+$)"), err)
					require.False(t, success)
				}
			})
		}
	})
}
@@ -18,7 +18,7 @@ func NewLoader(options *protocols.ExecuterOptions) (model.WorkflowLoader, error)
 	tagFilter := filter.New(&filter.Config{
 		Tags:        options.Options.Tags,
 		ExcludeTags: options.Options.ExcludeTags,
-		Authors:     options.Options.Author,
+		Authors:     options.Options.Authors,
 		Severities:  options.Options.Severities,
 		IncludeTags: options.Options.IncludeTags,
 	})

@@ -79,23 +79,6 @@ func newInternalResponse() *InternalResponse {
 	}
 }

-// Unused
-// func toInternalRequest(req *http.Request, target string, body []byte) *InternalRequest {
-// 	intReq := newInternalRquest()
-
-// 	intReq.Target = target
-// 	intReq.HTTPMajor = req.ProtoMajor
-// 	intReq.HTTPMinor = req.ProtoMinor
-// 	for k, v := range req.Header {
-// 		intReq.Headers[k] = v
-// 	}
-// 	intReq.Headers = req.Header
-// 	intReq.Method = req.Method
-// 	intReq.Body = body

-// 	return intReq
-// }

 func toInternalResponse(resp *http.Response, body []byte) *InternalResponse {
 	intResp := newInternalResponse()

@@ -125,14 +108,3 @@ func fromInternalResponse(intResp *InternalResponse) *http.Response {
 		Body: ioutil.NopCloser(bytes.NewReader(intResp.Body)),
 	}
 }

-// Unused
-// func fromInternalRequest(intReq *InternalRequest) *http.Request {
-// 	return &http.Request{
-// 		ProtoMinor:    intReq.HTTPMinor,
-// 		ProtoMajor:    intReq.HTTPMajor,
-// 		Header:        intReq.Headers,
-// 		ContentLength: int64(len(intReq.Body)),
-// 		Body:          ioutil.NopCloser(bytes.NewReader(intReq.Body)),
-// 	}
-// }

@ -42,13 +42,13 @@ func (pf *ProjectFile) Get(req []byte) (*http.Response, error) {
		return nil, fmt.Errorf("not found")
	}

	var httprecord HTTPRecord
	httprecord.Response = newInternalResponse()
	if err := unmarshal(data, &httprecord); err != nil {
	var httpRecord HTTPRecord
	httpRecord.Response = newInternalResponse()
	if err := unmarshal(data, &httpRecord); err != nil {
		return nil, err
	}

	return fromInternalResponse(httprecord.Response), nil
	return fromInternalResponse(httpRecord.Response), nil
}

func (pf *ProjectFile) Set(req []byte, resp *http.Response, data []byte) error {
@ -57,10 +57,10 @@ func (pf *ProjectFile) Set(req []byte, resp *http.Response, data []byte) error {
		return err
	}

	var httprecord HTTPRecord
	httprecord.Request = req
	httprecord.Response = toInternalResponse(resp, data)
	data, err = marshal(httprecord)
	var httpRecord HTTPRecord
	httpRecord.Request = req
	httpRecord.Response = toInternalResponse(resp, data)
	data, err = marshal(httpRecord)
	if err != nil {
		return err
	}

@ -1,49 +0,0 @@
package clusterer

import (
	"github.com/projectdiscovery/nuclei/v2/pkg/templates"
)

// Cluster clusters a list of templates into a lesser number if possible based
// on the similarity between the sent requests.
//
// If the attributes match, multiple requests can be clustered into a single
// request which saves time and network resources during execution.
func Cluster(list map[string]*templates.Template) [][]*templates.Template {
	final := [][]*templates.Template{}

	// Each protocol that can be clustered should be handled here.
	for key, template := range list {
		// We only cluster http requests as of now.
		// Take care of requests that can't be clustered first.
		if len(template.RequestsHTTP) == 0 {
			delete(list, key)
			final = append(final, []*templates.Template{template})
			continue
		}

		delete(list, key) // delete element first so it's not found later.
		// Find any/all similar matching request that is identical to
		// this one and cluster them together for http protocol only.
		if len(template.RequestsHTTP) == 1 {
			cluster := []*templates.Template{}

			for otherKey, other := range list {
				if len(other.RequestsHTTP) == 0 {
					continue
				}
				if template.RequestsHTTP[0].CanCluster(other.RequestsHTTP[0]) {
					delete(list, otherKey)
					cluster = append(cluster, other)
				}
			}
			if len(cluster) > 0 {
				cluster = append(cluster, template)
				final = append(final, cluster)
				continue
			}
		}
		final = append(final, []*templates.Template{template})
	}
	return final
}
@ -6,6 +6,7 @@ import (
	"github.com/projectdiscovery/gologger"
	"github.com/projectdiscovery/nuclei/v2/pkg/output"
	"github.com/projectdiscovery/nuclei/v2/pkg/protocols"
	"github.com/projectdiscovery/nuclei/v2/pkg/protocols/common/helpers/writer"
)

// Executer executes a group of requests for a protocol
@ -47,8 +48,6 @@ func (e *Executer) Execute(input string) (bool, error) {
	dynamicValues := make(map[string]interface{})
	previous := make(map[string]interface{})
	for _, req := range e.requests {
		req := req

		err := req.ExecuteWithResults(input, dynamicValues, previous, func(event *output.InternalWrappedEvent) {
			ID := req.GetID()
			if ID != "" {
@ -61,18 +60,17 @@ func (e *Executer) Execute(input string) (bool, error) {
					builder.Reset()
				}
			}
			if event.OperatorsResult == nil {
				return
			}
			for _, result := range event.Results {
				if e.options.IssuesClient != nil {
					if err := e.options.IssuesClient.CreateIssue(result); err != nil {
						gologger.Warning().Msgf("Could not create issue on tracker: %s", err)
					}
			// If no results were found, and also interactsh is not being used
			// in that case we can skip it, otherwise we've to show failure in
			// case of matcher-status flag.
			if event.OperatorsResult == nil && !event.UsesInteractsh {
				if err := e.options.Output.WriteFailure(event.InternalEvent); err != nil {
					gologger.Warning().Msgf("Could not write failure event to output: %s\n", err)
				}
			} else {
				if writer.WriteResult(event, e.options.Output, e.options.Progress, e.options.IssuesClient) {
					results = true
				_ = e.options.Output.Write(result)
				e.options.Progress.IncrementMatched()
				}
			}
		})
		if err != nil {
@ -83,6 +81,10 @@ func (e *Executer) Execute(input string) (bool, error) {
			}
			gologger.Warning().Msgf("[%s] Could not execute request for %s: %s\n", e.options.TemplateID, input, err)
		}
		// If a match was found and stop at first match is set, break out of the loop and return
		if results && (e.options.StopAtFirstMatch || e.options.Options.StopAtFirstMatch) {
			break
		}
	}
	return results, nil
}
@ -91,6 +93,7 @@ func (e *Executer) Execute(input string) (bool, error) {
func (e *Executer) ExecuteWithResults(input string, callback protocols.OutputEventCallback) error {
	dynamicValues := make(map[string]interface{})
	previous := make(map[string]interface{})
	var results bool

	for _, req := range e.requests {
		req := req
@ -110,6 +113,7 @@ func (e *Executer) ExecuteWithResults(input string, callback protocols.OutputEve
			if event.OperatorsResult == nil {
				return
			}
			results = true
			callback(event)
		})
		if err != nil {
@ -120,6 +124,10 @@ func (e *Executer) ExecuteWithResults(input string, callback protocols.OutputEve
			}
			gologger.Warning().Msgf("[%s] Could not execute request for %s: %s\n", e.options.TemplateID, input, err)
		}
		// If a match was found and stop at first match is set, break out of the loop and return
		if results && (e.options.StopAtFirstMatch || e.options.Options.StopAtFirstMatch) {
			break
		}
	}
	return nil
}

@ -4,12 +4,13 @@ import (
	"regexp"

	"github.com/Knetic/govaluate"

	"github.com/projectdiscovery/nuclei/v2/pkg/operators/common/dsl"
	"github.com/projectdiscovery/nuclei/v2/pkg/protocols/common/generators"
	"github.com/projectdiscovery/nuclei/v2/pkg/protocols/common/replacer"
)

var templateExpressionRegex = regexp.MustCompile(`(?m)\{\{[^}]+\}\}["'\)\}]*`)
var templateExpressionRegex = regexp.MustCompile(`(?m){{[^}]+}}["')}]*`)

// Evaluate checks if the match contains a dynamic variable, for each
// found one we will check if it's an expression and can

@ -6,11 +6,12 @@ import (
	"strings"
)

var unresolvedVariablesRegex = regexp.MustCompile(`(?:%7[B|b]|\{){2}([^}]+)(?:%7[D|d]|\}){2}["'\)\}]*`)
var unresolvedVariablesRegex = regexp.MustCompile(`(?:%7[B|b]|{){2}([^}]+)(?:%7[D|d]|}){2}["')}]*`)

// ContainsUnresolvedVariables returns an error with variable names if the passed
// input contains unresolved {{<pattern-here>}} variables.
func ContainsUnresolvedVariables(data string) error {
func ContainsUnresolvedVariables(items ...string) error {
	for _, data := range items {
		matches := unresolvedVariablesRegex.FindAllStringSubmatch(data, -1)
		if len(matches) == 0 {
			return nil
@ -29,9 +30,13 @@ func ContainsUnresolvedVariables(data string) error {
		}
		errorMessage := errorString.String()
		return errors.New(errorMessage)
	}

	return nil
}

func ContainsVariablesWithNames(data string, names map[string]interface{}) error {
func ContainsVariablesWithNames(names map[string]interface{}, items ...string) error {
	for _, data := range items {
		matches := unresolvedVariablesRegex.FindAllStringSubmatch(data, -1)
		if len(matches) == 0 {
			return nil
@ -53,4 +58,7 @@ func ContainsVariablesWithNames(data string, names map[string]interface{}) error
		}
		errorMessage := errorString.String()
		return errors.New(errorMessage)
	}

	return nil
}

97
v2/pkg/protocols/common/generators/attack_types.go
Normal file
@ -0,0 +1,97 @@
package generators

import (
	"encoding/json"
	"strings"

	"github.com/alecthomas/jsonschema"
	"github.com/pkg/errors"
)

// AttackType is the type of attack for payloads
type AttackType int

// Supported values for the AttackType
// name:AttackType
const (
	// name:batteringram
	BatteringRamAttack AttackType = iota + 1
	// name:pitchfork
	PitchForkAttack
	// name:clusterbomb
	ClusterBombAttack
	limit
)

// attackTypeMappings is a table for conversion of attack type from string.
var attackTypeMappings = map[AttackType]string{
	BatteringRamAttack: "batteringram",
	PitchForkAttack:    "pitchfork",
	ClusterBombAttack:  "clusterbomb",
}

func GetSupportedAttackTypes() []AttackType {
	var result []AttackType
	for index := AttackType(1); index < limit; index++ {
		result = append(result, index)
	}
	return result
}

func toAttackType(valueToMap string) (AttackType, error) {
	normalizedValue := normalizeValue(valueToMap)
	for key, currentValue := range attackTypeMappings {
		if normalizedValue == currentValue {
			return key, nil
		}
	}
	return -1, errors.New("invalid attack type: " + valueToMap)
}

func normalizeValue(value string) string {
	return strings.TrimSpace(strings.ToLower(value))
}

func (t AttackType) String() string {
	return attackTypeMappings[t]
}

// AttackTypeHolder is used to hold internal type of the protocol
type AttackTypeHolder struct {
	Value AttackType `mapping:"true"`
}

func (holder AttackTypeHolder) JSONSchemaType() *jsonschema.Type {
	gotType := &jsonschema.Type{
		Type:        "string",
		Title:       "type of the attack",
		Description: "Type of the attack",
	}
	for _, types := range GetSupportedAttackTypes() {
		gotType.Enum = append(gotType.Enum, types.String())
	}
	return gotType
}

func (holder *AttackTypeHolder) UnmarshalYAML(unmarshal func(interface{}) error) error {
	var marshalledTypes string
	if err := unmarshal(&marshalledTypes); err != nil {
		return err
	}

	computedType, err := toAttackType(marshalledTypes)
	if err != nil {
		return err
	}

	holder.Value = computedType
	return nil
}

func (holder *AttackTypeHolder) MarshalJSON() ([]byte, error) {
	return json.Marshal(holder.Value.String())
}

func (holder AttackTypeHolder) MarshalYAML() (interface{}, error) {
	return holder.Value.String(), nil
}
@ -2,49 +2,54 @@

package generators

import "github.com/pkg/errors"
import (
	"github.com/pkg/errors"

// Generator is the generator struct for generating payloads
type Generator struct {
	Type Type
	"github.com/projectdiscovery/nuclei/v2/pkg/catalog"
)

// PayloadGenerator is the generator struct for generating payloads
type PayloadGenerator struct {
	Type     AttackType
	payloads map[string][]string
}

// Type is type of attack
type Type int

const (
	// Batteringram replaces same payload into all of the defined payload positions at once.
	BatteringRam Type = iota + 1
	// PitchFork replaces variables with positional value from multiple wordlists
	PitchFork
	// ClusterBomb replaces variables with all possible combinations of values
	ClusterBomb
)

// StringToType is a table for conversion of attack type from string.
var StringToType = map[string]Type{
	"batteringram": BatteringRam,
	"pitchfork":    PitchFork,
	"clusterbomb":  ClusterBomb,
}

// New creates a new generator structure for payload generation
func New(payloads map[string]interface{}, payloadType Type, templatePath string) (*Generator, error) {
	generator := &Generator{}
func New(payloads map[string]interface{}, attackType AttackType, templatePath string, catalog *catalog.Catalog) (*PayloadGenerator, error) {
	if attackType.String() == "" {
		attackType = BatteringRamAttack
	}

	// Resolve payload paths if they are files.
	payloadsFinal := make(map[string]interface{})
	for name, payload := range payloads {
		payloadsFinal[name] = payload
	}
	for name, payload := range payloads {
		payloadStr, ok := payload.(string)
		if ok {
			final, resolveErr := catalog.ResolvePath(payloadStr, templatePath)
			if resolveErr != nil {
				return nil, errors.Wrap(resolveErr, "could not read payload file")
			}
			payloadsFinal[name] = final
		}
	}

	generator := &PayloadGenerator{}
	if err := generator.validate(payloads, templatePath); err != nil {
		return nil, err
	}

	compiled, err := loadPayloads(payloads)
	compiled, err := loadPayloads(payloadsFinal)
	if err != nil {
		return nil, err
	}
	generator.Type = payloadType
	generator.Type = attackType
	generator.payloads = compiled

	// Validate the batteringram payload set
	if payloadType == BatteringRam {
	if attackType == BatteringRamAttack {
		if len(payloads) != 1 {
			return nil, errors.New("batteringram must have single payload set")
		}
@ -54,7 +59,7 @@ func New(payloads map[string]interface{}, payloadType Type, templatePath string)

// Iterator is a single instance of an iterator for a generator structure
type Iterator struct {
	Type Type
	Type        AttackType
	position    int
	msbIterator int
	total       int
@ -62,7 +67,7 @@ type Iterator struct {
}

// NewIterator creates a new iterator for the payloads generator
func (g *Generator) NewIterator() *Iterator {
func (g *PayloadGenerator) NewIterator() *Iterator {
	var payloads []*payloadIterator

	for name, values := range g.payloads {
@ -95,18 +100,18 @@ func (i *Iterator) Remaining() int {
func (i *Iterator) Total() int {
	count := 0
	switch i.Type {
	case BatteringRam:
	case BatteringRamAttack:
		for _, p := range i.payloads {
			count += len(p.values)
		}
	case PitchFork:
	case PitchForkAttack:
		count = len(i.payloads[0].values)
		for _, p := range i.payloads {
			if count > len(p.values) {
				count = len(p.values)
			}
		}
	case ClusterBomb:
	case ClusterBombAttack:
		count = 1
		for _, p := range i.payloads {
			count *= len(p.values)
@ -118,11 +123,11 @@ func (i *Iterator) Total() int {
// Value returns the next value for an iterator
func (i *Iterator) Value() (map[string]interface{}, bool) {
	switch i.Type {
	case BatteringRam:
	case BatteringRamAttack:
		return i.batteringRamValue()
	case PitchFork:
	case PitchForkAttack:
		return i.pitchforkValue()
	case ClusterBomb:
	case ClusterBombAttack:
		return i.clusterbombValue()
	default:
		return i.batteringRamValue()
@ -179,7 +184,7 @@ func (i *Iterator) clusterbombValue() (map[string]interface{}, bool) {
		signalNext = false
	}
	if !p.next() {
		// No more inputs in this inputprovider
		// No more inputs in this input provider
		if index == i.msbIterator {
			// Reset all previous wordlists and increment the msb counter
			i.msbIterator++

@ -4,12 +4,15 @@ import (
	"testing"

	"github.com/stretchr/testify/require"

	"github.com/projectdiscovery/nuclei/v2/pkg/catalog"
)

func TestBatteringRamGenerator(t *testing.T) {
	usernames := []string{"admin", "password"}

	generator, err := New(map[string]interface{}{"username": usernames}, BatteringRam, "")
	catalogInstance := catalog.New("")
	generator, err := New(map[string]interface{}{"username": usernames}, BatteringRamAttack, "", catalogInstance)
	require.Nil(t, err, "could not create generator")

	iterator := generator.NewIterator()
@ -28,7 +31,8 @@ func TestPitchforkGenerator(t *testing.T) {
	usernames := []string{"admin", "token"}
	passwords := []string{"password1", "password2", "password3"}

	generator, err := New(map[string]interface{}{"username": usernames, "password": passwords}, PitchFork, "")
	catalogInstance := catalog.New("")
	generator, err := New(map[string]interface{}{"username": usernames, "password": passwords}, PitchForkAttack, "", catalogInstance)
	require.Nil(t, err, "could not create generator")

	iterator := generator.NewIterator()
@ -49,7 +53,8 @@ func TestClusterbombGenerator(t *testing.T) {
	usernames := []string{"admin"}
	passwords := []string{"admin", "password", "token"}

	generator, err := New(map[string]interface{}{"username": usernames, "password": passwords}, ClusterBomb, "")
	catalogInstance := catalog.New("")
	generator, err := New(map[string]interface{}{"username": usernames, "password": passwords}, ClusterBombAttack, "", catalogInstance)
	require.Nil(t, err, "could not create generator")

	iterator := generator.NewIterator()

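The renamed `Total()` method above encodes how many payload combinations each attack type yields: batteringram sums the payload lists, pitchfork is bounded by the shortest list, and clusterbomb takes the cartesian product. A simplified arithmetic sketch of that logic (our own illustration over plain string slices, not the real `Iterator`):

```go
package main

import "fmt"

// totals mirrors the Total() switch from the diff for plain string slices:
// batteringram iterates each list in turn, pitchfork walks the lists in
// lockstep (bounded by the shortest), clusterbomb multiplies list lengths.
func totals(payloads map[string][]string) (batteringram, pitchfork, clusterbomb int) {
	clusterbomb = 1
	first := true
	for _, values := range payloads {
		batteringram += len(values)
		if first || len(values) < pitchfork {
			pitchfork = len(values)
		}
		first = false
		clusterbomb *= len(values)
	}
	return
}

func main() {
	// Same inputs as TestPitchforkGenerator above.
	payloads := map[string][]string{
		"username": {"admin", "token"},
		"password": {"password1", "password2", "password3"},
	}
	b, p, c := totals(payloads)
	fmt.Println(b, p, c) // 5 2 6
}
```

For two usernames and three passwords this gives 5 batteringram values, 2 pitchfork pairs, and 6 clusterbomb combinations, matching the counts the renamed constants select between.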
Some files were not shown because too many files have changed in this diff.