In the case of machine learning in particular, a hyperparameter is not part of the training data set.
Hyperparameters can also be considered in domains outside of machine learning, however. For example, the step size in a partial differential equation solver is entirely independent from the problem itself and could be considered a hyperparameter. One difference from machine learning, however, is that step size hyperparameters in numerical analysis are clearly better when smaller, at the cost of more computation. In machine learning, by contrast, there is often an optimum somewhere, beyond which overfitting becomes excessive.
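The step-size trade-off can be sketched with the simplest possible solver: forward Euler on the ordinary differential equation y' = y, whose exact value at t = 1 is e. Halving the step size reliably reduces the error, but proportionally increases the number of steps; there is no overfitting-style optimum. (This is a toy sketch; the function and problem here are chosen for illustration, not taken from the text.)

```python
import math

def euler_solve(f, y0, t_end, h):
    """Integrate y' = f(t, y) from t = 0 to t_end with forward Euler,
    where h is the step size "hyperparameter"."""
    t, y, steps = 0.0, y0, 0
    while t < t_end - 1e-12:   # small tolerance to absorb float accumulation
        y += h * f(t, y)
        t += h
        steps += 1
    return y, steps

# y' = y with y(0) = 1; exact solution at t = 1 is e.
for h in (0.1, 0.01, 0.001):
    y, steps = euler_solve(lambda t, y: y, 1.0, 1.0, h)
    print(f"h={h}: steps={steps}, error={abs(y - math.e):.5f}")
```

Smaller h always gives a smaller error here, just at the price of more steps: accuracy is limited only by how much computation you are willing to spend.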
Articles by others on the same topic
A hyperparameter is a configuration or parameter that is set before the training of a machine learning model begins and is not learned from the data during training. Essentially, these parameters influence the training process itself and can affect the model's performance. Hyperparameters differ from model parameters, which are the values adjusted by the learning algorithm during the training process, such as weights in a neural network.
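The distinction between hyperparameters and model parameters can be sketched with a minimal gradient-descent fit of y ≈ w·x. The data, the learning rate and the epoch count below are hypothetical values chosen for illustration: the learning rate and epoch count are hyperparameters fixed before training, while the weight w is a model parameter adjusted by the training loop itself.

```python
# Toy data, roughly y = 2x.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 8.0]

def train(learning_rate, n_epochs):
    """learning_rate and n_epochs are hyperparameters: set before
    training starts, never updated by the data."""
    w = 0.0  # model parameter: learned from the data during training
    for _ in range(n_epochs):
        # Gradient of the mean squared error with respect to w.
        grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
        w -= learning_rate * grad
    return w

w = train(learning_rate=0.01, n_epochs=200)
print(w)  # close to 2, the slope underlying the data
```

Changing the hyperparameters changes how training behaves (a too-large learning rate would diverge), but only w is ever fitted to the data.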