### PyTorch regression _2.1_ [WorldHappinessReport.csv]

April 30, 2020

300420201044 https://github.com/jcjohnson/pytorch-examples#pytorch-custom-nn-modules In [1]: import torch I'm initializing the GPU graphics card (which I don't have). In [2]: device = […]
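
The excerpt stops at the `device = …` line; below is a minimal sketch of the usual device-selection idiom it is setting up. The tensor creation at the end is an illustrative assumption, not from the post.

```python
import torch

# Pick the GPU if one is available, otherwise fall back to the CPU --
# this is the standard idiom behind the excerpt's `device = ...` line.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Tensors (and models) are then placed on that device explicitly.
x = torch.randn(4, 3, device=device)
```

On a machine without a CUDA GPU, as in the post, `device` simply resolves to the CPU and everything still runs.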

### PyTorch regression _1.1_ [WorldHappinessReport]

April 29, 2020

290420201753. Building small prototypes with full combat capability. Goals: 1. plug in a real data file 2. train and save the model, then run it 3. verify that the model computes […]

### Review of models based on gradient boosting: XGBoost, LightGBM, CatBoost

April 24, 2020

240120202201 In [67]: # Classification Assessment def Classification_Assessment(model, Xtrain, ytrain, Xtest, ytest): import numpy as np import matplotlib.pyplot as plt from sklearn import metrics from sklearn.metrics […]
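
The post defines a `Classification_Assessment` helper; a minimal sketch of what such a helper typically does, assuming a scikit-learn-style model. The metric choices and the synthetic data below are assumptions, since the excerpt truncates before the function body.

```python
import numpy as np
from sklearn import metrics
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

def Classification_Assessment(model, Xtrain, ytrain, Xtest, ytest):
    # Fit on the training split, then report standard metrics on the test split.
    model.fit(Xtrain, ytrain)
    ypred = model.predict(Xtest)
    print(metrics.classification_report(ytest, ypred))
    print(metrics.confusion_matrix(ytest, ypred))
    return metrics.accuracy_score(ytest, ypred)

# Synthetic usage example (random data, so the scores are illustrative only)
rng = np.random.RandomState(0)
X = rng.rand(200, 4)
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
acc = Classification_Assessment(RandomForestClassifier(random_state=0),
                                Xtr, ytr, Xte, yte)
```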

### A few simple examples of object-oriented programming in Python

April 24, 2020

240420202112 Cats ¶ In [1]: class kolor_kota(object): def __init__(self, imię, kolor): self.imię = imię self […]
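
Cleaned up, the class from the excerpt looks roughly like this. The Polish identifiers are kept as the post wrote them; the `opis` method and the example instance are hypothetical completions, since the excerpt truncates mid-definition.

```python
class kolor_kota(object):          # "cat colour" class from the post
    def __init__(self, imię, kolor):
        self.imię = imię           # imię = name
        self.kolor = kolor         # kolor = colour

    def opis(self):                # hypothetical "describe" method
        return f"{self.imię} is {self.kolor}"

# Illustrative instance (not from the post)
filemon = kolor_kota("Filemon", "black")
```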

### Perfect Plots: Bubble Plot

April 24, 2020

Perfect Plots: Bubble Plot In [1]: import pandas as pd import matplotlib.pyplot as plt import numpy as np Autos ¶ Source of data: https://datahub.io/machine-learning/autos In [2]: […]
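
A bubble plot is just a scatter plot whose marker sizes encode a third variable. A minimal sketch with synthetic data (the autos dataset from the post is not reproduced here):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # render off-screen so the sketch runs headless
import matplotlib.pyplot as plt

rng = np.random.RandomState(0)
x = rng.rand(30)          # e.g. engine size
y = rng.rand(30)          # e.g. price
z = rng.rand(30) * 300    # third variable, mapped to bubble area

fig, ax = plt.subplots()
bubbles = ax.scatter(x, y, s=z, alpha=0.5)  # s= controls marker area
ax.set_xlabel("x")
ax.set_ylabel("y")
fig.savefig("bubble.png")
```

The `alpha` transparency keeps overlapping bubbles readable, which matters once sizes vary widely.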

### Interpretation of SHAP charts for the Titanic case (Feature Selection Techniques)

April 9, 2020

090420202257 https://github.com/slundberg/shap In [1]: import pandas as pd df = pd.read_csv('/home/wojciech/Pulpit/1/tit_train.csv', na_values="-1") df.head(2) Out[1]: Unnamed: 0 PassengerId Survived Pclass Name Sex Age SibSp Parch […]

### Homemade loop to search for the best functions for the regression model (Feature Selection Techniques)

April 9, 2020

090420201150 In [1]: import pandas as pd df = pd.read_csv('/home/wojciech/Pulpit/1/tit_train.csv', na_values="-1") df.head(2) Out[1]: Unnamed: 0 PassengerId Survived Pclass Name Sex Age SibSp Parch Ticket Fare […]
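
The idea of a "homemade loop" for feature search is to fit a simple model on each candidate feature in turn and keep the ones that score best. A NumPy-only sketch ranking single features by R² against the target (the choice of a one-variable linear fit is an assumption here; the post applies its own models):

```python
import numpy as np

def rank_features_by_r2(X, y):
    """Fit y ~ a*x + b separately on each column of X and rank columns by R^2."""
    scores = []
    ss_tot = np.sum((y - y.mean()) ** 2)
    for j in range(X.shape[1]):
        a, b = np.polyfit(X[:, j], y, 1)              # simple least-squares line
        ss_res = np.sum((y - (a * X[:, j] + b)) ** 2)  # residual sum of squares
        scores.append((1 - ss_res / ss_tot, j))        # R^2 for this column
    return [j for _, j in sorted(scores, reverse=True)]

# Synthetic check: column 2 drives the target, so it should rank first
rng = np.random.RandomState(0)
X = rng.rand(100, 4)
y = 3 * X[:, 2] + 0.05 * rng.randn(100)
order = rank_features_by_r2(X, y)
```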

### How to calculate the probability of surviving the Titanic catastrophe

April 9, 2020

080420201050 practical use: predict_proba In [1]: import numpy as np import pandas as pd from sklearn.model_selection import train_test_split from sklearn.ensemble import RandomForestClassifier In [2]: from catboost.datasets import […]
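
`predict_proba` returns per-class probabilities instead of hard labels. A minimal sketch with a random forest on synthetic data (the post uses the Titanic dataset from `catboost.datasets`, which is not reproduced here):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.RandomState(0)
X = rng.rand(300, 3)
y = (X[:, 0] > 0.5).astype(int)   # stand-in for the "survived" label

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

# Shape (n_samples, 2): column 0 is P(class 0), column 1 is P(class 1);
# each row sums to 1, so the second column reads as "probability of survival".
proba = model.predict_proba(X_test)
```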

### CatBoost Step 1. CatBoostClassifier (cat_features)

April 3, 2020

030420200928 In [1]: ## colorful prints def black(text): print('\033[30m', text, '\033[0m', sep='') def red(text): print('\033[31m', text, '\033[0m', sep='') def green(text): print('\033[32m', text, '\033[0m', sep='') def […]

### Feature Selection Techniques [categorical result] – Step Forward Selection

April 1, 2020

010420201017 Forward selection is an iterative method in which we start with no features in the model. In each iteration, we add the feature that […]
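
The forward-selection loop described above can be sketched with plain least squares. Scoring by residual sum of squares is an assumption here; the excerpt truncates before naming the post's own scoring metric.

```python
import numpy as np

def forward_select(X, y, k):
    """Greedily add, one at a time, the feature that most reduces the
    least-squares error of a linear fit (with intercept)."""
    selected, remaining = [], list(range(X.shape[1]))
    while len(selected) < k and remaining:
        def sse(cols):
            # Fit y on the chosen columns plus an intercept term
            A = np.column_stack([X[:, cols], np.ones(len(y))])
            coef, *_ = np.linalg.lstsq(A, y, rcond=None)
            return np.sum((A @ coef - y) ** 2)
        best = min(remaining, key=lambda j: sse(selected + [j]))
        selected.append(best)
        remaining.remove(best)
    return selected

# Synthetic check: columns 1 and 3 drive the target
rng = np.random.RandomState(0)
X = rng.rand(120, 5)
y = 4 * X[:, 1] + 2 * X[:, 3] + 0.05 * rng.randn(120)
chosen = forward_select(X, y, 2)
```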