Pytorch regression 3.7 [BikeSharing.csv]
https://sigmaquality.pl/models/pytorch/pytorch-regression-3-7-bikesharing-csv-030520201303/
Wojciech Moszczyński, Sun, 03 May 2020

Work on diagnostic systems.

The tank prototype can really be checked in combat conditions!

https://archive.ics.uci.edu/ml/datasets/Bike+Sharing+Dataset

In [1]:
import torch

Starting up the GPU graphics card (which I don't have), so the computations run on the CPU.

In [2]:
device = torch.device('cpu') # I do the computations on the CPU
#device = torch.device('cuda') # I do the computations on the GPU
In [3]:
import pandas as pd

df = pd.read_csv('/home/wojciech/Pulpit/3/BikeSharing.csv')
print(df.shape)
df.head(3)
(17379, 17)
Out[3]:
instant dteday season yr mnth hr holiday weekday workingday weathersit temp atemp hum windspeed casual registered cnt
0 1 2011-01-01 1 0 1 0 0 6 0 1 0.24 0.2879 0.81 0.0 3 13 16
1 2 2011-01-01 1 0 1 1 0 6 0 1 0.22 0.2727 0.80 0.0 8 32 40
2 3 2011-01-01 1 0 1 2 0 6 0 1 0.22 0.2727 0.80 0.0 5 27 32

cnt: count of total rental bikes including both casual and registered

I fill all missing values ('holes') with out-of-range values.

In [4]:
import matplotlib.pyplot as plt
import seaborn as sns

plt.figure(figsize=(10,6))
CORREL =df.corr()
sns.heatmap(CORREL, annot=True, cbar=False, cmap="coolwarm")
plt.title('Macierz korelacji ze zmienną wynikową y', fontsize=20)
Out[4]:
Text(0.5, 1, 'Macierz korelacji ze zmienną wynikową y')
In [5]:
import matplotlib.pyplot as plt

plt.figure(figsize=(10,6))
CORREL['cnt'].plot(kind='barh', color='red')
plt.title('Korelacja ze zmienną wynikową', fontsize=20)
plt.xlabel('Poziom korelacji')
plt.ylabel('Zmienne nezależne ciągłe')
Out[5]:
Text(0, 0.5, 'Zmienne nezależne ciągłe')

The variables 'registered' and 'casual' are themselves just the outcome shown in a different way (together they make up 'cnt'), so they must be removed from the data; a quick check follows below.

In [6]:
a,b = df.shape     #<- how many columns we have
b

print('NUMBER OF EMPTY RECORDS vs. FULL RECORDS')
print('----------------------------------------')
for i in range(1,b):
    i = df.columns[i]
    r = df[i].isnull().sum()
    h = df[i].count()
    pr = (r/h)*100
   
    if r > 0:
        print(i,"--------",r,"--------",h,"--------",pr) 
NUMBER OF EMPTY RECORDS vs. FULL RECORDS
----------------------------------------
In [7]:
import seaborn as sns

sns.heatmap(df.isnull(),yticklabels=False,cbar=False,cmap='viridis')
Out[7]:
<matplotlib.axes._subplots.AxesSubplot at 0x7f9e0102a110>
In [8]:
#del df['Unnamed: 15']
#del df['Unnamed: 16']

df = df.dropna(how='any') # in the end I simply delete the holes (rows with NaN)

# df.fillna(-777, inplace=True)
df.isnull().sum()
Out[8]:
instant       0
dteday        0
season        0
yr            0
mnth          0
hr            0
holiday       0
weekday       0
workingday    0
weathersit    0
temp          0
atemp         0
hum           0
windspeed     0
casual        0
registered    0
cnt           0
dtype: int64
In [9]:
print(df.dtypes)
df.head(3)
instant         int64
dteday         object
season          int64
yr              int64
mnth            int64
hr              int64
holiday         int64
weekday         int64
workingday      int64
weathersit      int64
temp          float64
atemp         float64
hum           float64
windspeed     float64
casual          int64
registered      int64
cnt             int64
dtype: object
Out[9]:
instant dteday season yr mnth hr holiday weekday workingday weathersit temp atemp hum windspeed casual registered cnt
0 1 2011-01-01 1 0 1 0 0 6 0 1 0.24 0.2879 0.81 0.0 3 13 16
1 2 2011-01-01 1 0 1 1 0 6 0 1 0.22 0.2727 0.80 0.0 8 32 40
2 3 2011-01-01 1 0 1 2 0 6 0 1 0.22 0.2727 0.80 0.0 5 27 32

to_datetime

In [10]:
df['dteday'] =  pd.to_datetime(df['dteday'])
df['weekday'] = df.dteday.dt.weekday
df['month'] =df.dteday.dt.month
df['weekofyear'] =df.dteday.dt.weekofyear 
In [11]:
del df['dteday']
In [12]:
print(df.dtypes)
df.head(3)
instant         int64
season          int64
yr              int64
mnth            int64
hr              int64
holiday         int64
weekday         int64
workingday      int64
weathersit      int64
temp          float64
atemp         float64
hum           float64
windspeed     float64
casual          int64
registered      int64
cnt             int64
month           int64
weekofyear      int64
dtype: object
Out[12]:
instant season yr mnth hr holiday weekday workingday weathersit temp atemp hum windspeed casual registered cnt month weekofyear
0 1 1 0 1 0 0 5 0 1 0.24 0.2879 0.81 0.0 3 13 16 1 52
1 2 1 0 1 1 0 5 0 1 0.22 0.2727 0.80 0.0 8 32 40 1 52
2 3 1 0 1 2 0 5 0 1 0.22 0.2727 0.80 0.0 5 27 32 1 52

Encoding the text values

In [13]:
import numpy as np

a,b = df.shape     #<- how many columns we have
b

print('DISCRETE FUNCTIONS CODED')
print('------------------------')
for i in range(1,b):
    i = df.columns[i]
    f = df[i].dtypes
    if f == np.object:
        print(i,"---",f)   
    
        if f == np.object:
        
            df[i] = pd.Categorical(df[i]).codes
        
            continue
DISCRETE FUNCTIONS CODED
------------------------

df['Time'] = pd.Categorical(df['Time']).codes
df['Time'] = df['Time'].astype(int)

In [14]:
df.dtypes
Out[14]:
instant         int64
season          int64
yr              int64
mnth            int64
hr              int64
holiday         int64
weekday         int64
workingday      int64
weathersit      int64
temp          float64
atemp         float64
hum           float64
windspeed     float64
casual          int64
registered      int64
cnt             int64
month           int64
weekofyear      int64
dtype: object
In [15]:
df.columns
Out[15]:
Index(['instant', 'season', 'yr', 'mnth', 'hr', 'holiday', 'weekday',
       'workingday', 'weathersit', 'temp', 'atemp', 'hum', 'windspeed',
       'casual', 'registered', 'cnt', 'month', 'weekofyear'],
      dtype='object')

I cut out an 'iron' test reserve: a super-test set of 0.5% of the rows.

In [16]:
R,C =df.shape
F = R*0.005
L = R - F
L
Out[16]:
17292.105
In [17]:
df5 = df[df.index>=L]
df2 = df[df.index<L]
print('Zbiór super testowy df5:',df5.shape)
print('df2:                    ',df2.shape) 
Zbiór super testowy df5: (86, 18)
df2:                     (17293, 18)

I specify what is X and what is y.

In [18]:
X = df2.drop(['cnt','registered','casual'],1)
y = df2['cnt']
In [19]:
X_SuperT = df5.drop(['cnt','registered','casual'],1)
y_SuperT = df5['cnt']

Scaling (normalization) of the X values

X should never be too large. Ideally it should lie in the range [-1, 1]. If this is not the case, normalize the input.

In [20]:
from sklearn.preprocessing import StandardScaler

sc = StandardScaler()
X = sc.fit_transform(X)

print(np.round(X.std(), decimals=2), np.round(X.mean(), decimals=2))
1.0 0.0
In [21]:
y.value_counts()
Out[21]:
5      260
6      235
4      231
3      221
2      207
      ... 
725      1
709      1
661      1
629      1
887      1
Name: cnt, Length: 869, dtype: int64
In [22]:
y = (y / 100)  # scale the target down by a factor of 100
#print(y.head(3))
print(np.round(y.std(), decimals=2), np.round(y.mean(), decimals=2))
1.82 1.9

Converting the input and output data to tensors

In [23]:
import numpy as np

#X = X.values       # - does not work when normalization was applied (X is already a NumPy array)
X = torch.tensor(X)
print(X[:3])
tensor([[-1.7320, -1.3663, -1.0002, -1.6087, -1.6694, -0.1726,  0.9965, -1.4710,
         -0.6633, -1.3437, -1.1028,  0.9456, -1.5557, -1.6087,  1.7030],
        [-1.7318, -1.3663, -1.0002, -1.6087, -1.5247, -0.1726,  0.9965, -1.4710,
         -0.6633, -1.4477, -1.1914,  0.8938, -1.5557, -1.6087,  1.7030],
        [-1.7316, -1.3663, -1.0002, -1.6087, -1.3801, -0.1726,  0.9965, -1.4710,
         -0.6633, -1.4477, -1.1914,  0.8938, -1.5557, -1.6087,  1.7030]],
       dtype=torch.float64)
In [24]:
X = X.type(torch.FloatTensor)
print(X[:3])
tensor([[-1.7320, -1.3663, -1.0002, -1.6087, -1.6694, -0.1726,  0.9965, -1.4710,
         -0.6633, -1.3437, -1.1028,  0.9456, -1.5557, -1.6087,  1.7030],
        [-1.7318, -1.3663, -1.0002, -1.6087, -1.5247, -0.1726,  0.9965, -1.4710,
         -0.6633, -1.4477, -1.1914,  0.8938, -1.5557, -1.6087,  1.7030],
        [-1.7315, -1.3663, -1.0002, -1.6087, -1.3801, -0.1726,  0.9965, -1.4710,
         -0.6633, -1.4477, -1.1914,  0.8938, -1.5557, -1.6087,  1.7030]])
In [25]:
y = y.values   # create a NumPy array - does not work if normalization was applied
In [26]:
y = torch.tensor(y)
print(y[:3])
tensor([0.1600, 0.4000, 0.3200], dtype=torch.float64)

Transposing the result vector so that it becomes a column

y = y.view(y.shape[0],1)
y[:5]

In [27]:
y = y.type(torch.FloatTensor)
In [28]:
print('X:',X.shape)
print('y:',y.shape)
X: torch.Size([17293, 15])
y: torch.Size([17293])

Adding one dimension to the result vector

In [29]:
y = y.view(y.shape[0],1)
y.shape
Out[29]:
torch.Size([17293, 1])

Splitting into a training set and a test set

In [30]:
a,b = X.shape
a

total_records = a
test_records = int(a * .2)

X_train = X[:total_records-test_records]
X_test = X[total_records-test_records:total_records]

y_train = y[:total_records-test_records]
y_test = y[total_records-test_records:total_records]
In [31]:
print('X_train: ',X_train.shape)
print('X_test:  ',X_test.shape)
print('----------------------------------------------------')
print('y_train: ',y_train.shape)
print('y_test:  ',y_test.shape)
X_train:  torch.Size([13835, 15])
X_test:   torch.Size([3458, 15])
----------------------------------------------------
y_train:  torch.Size([13835, 1])
y_test:   torch.Size([3458, 1])

Defining the neural network

Programming torch.nn.Module
In [32]:
import torch.nn.functional as F   # needed for F.relu below

class Net(torch.nn.Module):
    def __init__(self, n_feature, n_hidden, n_output):
        super(Net, self).__init__()
        self.hidden = torch.nn.Linear(n_feature, n_hidden)   # hidden layer
        self.predict = torch.nn.Linear(n_hidden, n_output)   # output layer

    def forward(self, x):
        x = F.relu(self.hidden(x))      # activation function for hidden layer
        x = self.predict(x)             # linear output
        return x
Defining the shape of the network
In [33]:
N, D_in = X.shape
N, D_out = y.shape

H = 100
device = torch.device('cpu')
In [34]:
net = torch.nn.Sequential(
        torch.nn.Linear(D_in,  H),
        torch.nn.LeakyReLU(),
        torch.nn.Linear(H, H),
        torch.nn.LeakyReLU(),
        torch.nn.Linear(H, D_out),
    ).to(device)  
In [35]:
net(X_train)
Out[35]:
tensor([[-0.1001],
        [-0.0887],
        [-0.0803],
        ...,
        [-0.0541],
        [-0.0833],
        [-0.0890]], grad_fn=<AddmmBackward>)

The optimization algorithm (the optimizer)

lr: the learning rate -> the speed at which our model updates the weights in the cells each time backward propagation is carried out (a small sketch follows below).
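For intuition, here is a minimal sketch (not the author's code, and assuming loss.backward() has already filled the gradients) of what a single plain SGD step with learning rate lr does to each weight; Adam, used below, follows the same idea with adaptive per-parameter step sizes:

# Conceptual update performed by one optimizer step with plain SGD:
with torch.no_grad():
    for param in net.parameters():
        param -= 0.01 * param.grad   # 0.01 plays the role of lr here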

In [36]:
#optimizer = torch.optim.SGD(net.parameters(), lr=0.01, momentum=0, dampening=0, weight_decay=0, nesterov=False) #-2.401
#optimizer = torch.optim.SGD(net.parameters(), lr=0.1) #-4.086
optimizer = torch.optim.Adam(net.parameters(), lr=0.01) #-5.298
#optimizer = torch.optim.Adamax(net.parameters(), lr=0.01) #-6.610
#optimizer = torch.optim.ASGD(net.parameters(), lr=0.01, lambd=0.0001, alpha=0.15, t0=000000.0) #-2.315
#optimizer = torch.optim.LBFGS(net.parameters(), lr=0.01, max_iter=20, max_eval=None, tolerance_grad=1e-05, tolerance_change=1e-09, history_size=100, line_search_fn=None)
#optimizer = torch.optim.RMSprop(net.parameters(), lr=0.01, alpha=0.99, eps=1e-08) #-5.152
#optimizer = torch.optim.Rprop(net.parameters(), lr=0.01, etas=(0.5, 1.2), step_sizes=(1e-06, 50))  #R2:-7.388

Defining the loss function

For this regression we use the MSE loss (a quick check of what MSELoss computes follows the next cell); the R² score is reported later by Regression_Assessment.

In [37]:
loss_func = torch.nn.MSELoss()
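As a quick check (a minimal, illustrative sketch that is not part of the original notebook), torch.nn.MSELoss() is simply the mean of the squared differences between predictions and targets:

import torch
pred   = torch.tensor([1.0, 2.0, 3.0])        # illustrative values
target = torch.tensor([1.5, 2.0, 2.0])
print(torch.nn.MSELoss()(pred, target))       # tensor(0.4167)
print(((pred - target) ** 2).mean())          # the same value, computed by hand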

Defining the training process and training the model

In [38]:
inputs = X_train                          #1. declare the x and y used for training
outputs = y_train
for i in range(2000):                     #2. loop over 2000 repetitions (epochs)
   prediction = net(inputs)
   loss = loss_func(prediction, outputs)  # the loss function returns a Tensor containing the loss
   optimizer.zero_grad()
   loss.backward()
   optimizer.step()

   if i % 200 == 0:
      print(i, loss.item())
0 5.6923956871032715
200 0.27917081117630005
400 0.12776944041252136
600 0.10168859362602234
800 0.11591815203428268
1000 0.11488667875528336
1200 0.08809500187635422
1400 0.08998128771781921
1600 0.07167605310678482
1800 0.07469654083251953

There are many potential reasons. Most likely exploding gradients. The two things to try first (a small sketch follows this list):

  • Normalize the inputs
  • Lower the learning rate
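A minimal sketch of the second point, assuming the net and optimizer defined above: the inputs were already normalized with StandardScaler earlier, so the remaining step is simply to rebuild the optimizer with a smaller learning rate.

# Lowering the learning rate: rebuild the Adam optimizer with a smaller lr (illustrative value)
optimizer = torch.optim.Adam(net.parameters(), lr=0.001)   # 0.001 instead of the 0.01 used above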

import matplotlib.pyplot as plt

plt.plot(range(epochs), aggregated_losses)   # assumes the losses were collected in aggregated_losses during training
plt.ylabel('Loss')
plt.xlabel('epoch')
plt.show()

Forecast based on the model

  • we substitute the same equations that were used in the model
  • the loss result below refers to the last model sequence
  • the loss shows how wrong the model is (loss = sum of squared errors) after the last learning sequence
In [39]:
with torch.no_grad():
    y_pred = net(X_test)  
    loss = (y_pred - y_test).pow(2).sum()

    print(f'Loss test_set: {loss:.8f}')
Loss test_set: 3405.68505859

Since we decided that our output layer would contain 1 neuron, each forecast contains a single value. For example, the first 5 predicted values look like this:

In [40]:
y_pred[:5]
Out[40]:
tensor([[3.9657],
        [4.3590],
        [4.1216],
        [3.7958],
        [3.2563]])

We save the whole model

In [41]:
torch.save(net,'/home/wojciech/Pulpit/7/byk15.pb')

We load the whole model back

In [42]:
KOT = torch.load('/home/wojciech/Pulpit/7/byk15.pb')
KOT.eval()
Out[42]:
Sequential(
  (0): Linear(in_features=15, out_features=100, bias=True)
  (1): LeakyReLU(negative_slope=0.01)
  (2): Linear(in_features=100, out_features=100, bias=True)
  (3): LeakyReLU(negative_slope=0.01)
  (4): Linear(in_features=100, out_features=1, bias=True)
)

By substituting other independent variables, you can get a vector of output variables.

We pick a random record from the tensor.

In [43]:
y_pred = y_pred*10
foka = y_pred.cpu().detach().numpy()
df11 = pd.DataFrame(foka)
df11.columns = ['y_pred']
df11=np.round(df11.y_pred)
df11.head(3)
Out[43]:
0    40.0
1    44.0
2    41.0
Name: y_pred, dtype: float32
In [44]:
y_test = y_test*10
foka = y_test.cpu().detach().numpy()
df_t = pd.DataFrame(foka)
df_t.columns = ['y']
df_t.head(3)
Out[44]:
y
0 45.000000
1 49.200001
2 49.000000
In [45]:
NOWA = pd.merge(df_t,df11, how='inner', left_index=True, right_index=True)
NOWA.head(3)
Out[45]:
y y_pred
0 45.000000 40.0
1 49.200001 44.0
2 49.000000 41.0
In [46]:
NOWA.to_csv('/home/wojciech/Pulpit/7/NOWA.csv')
In [47]:
fig, ax = plt.subplots( figsize=(16, 2))
for ewa in ['y', 'y_pred']:
    ax.plot(NOWA[ewa], label=ewa)
    
ax.set_xlim(1340, 1500)
#ax.legend()
ax.set_ylabel('Parameter')
ax.set_title('COURSE OF THE PROJECTING PROCESS ON THE TEST SET')
Out[47]:
Text(0.5, 1.0, 'COURSE OF THE PROJECTING PROCESS ON THE TEST SET')
In [48]:
## margins
plt.subplots_adjust( left = None , bottom = None , right = None , top = None , wspace = None , hspace = None )
plt.figure(figsize=(16,5))
ax = plt.subplot(1, 2, 1)
NOWA.plot.kde(ax=ax, legend=True, title='Histogram: y vs. y_pred')
NOWA.plot.hist(density=True,bins=40, ax=ax, alpha=0.3)
ax.set_title("Distributions")

ax = plt.subplot(1, 2, 2)
sns.boxplot(data = NOWA)
plt.xticks(rotation=-90)
ax.set_title("Boxes")


sns.lmplot(data=NOWA, x='y', y='y_pred')
Out[48]:
<seaborn.axisgrid.FacetGrid at 0x7f9df115b2d0>
<Figure size 432x288 with 0 Axes>

Regression_Assessment

In [49]:
## Performs the assessment for a single target variable only

def Regression_Assessment(y, y_pred):
    
    from sklearn.metrics import r2_score 
    import scipy.stats as stats
    from statsmodels.graphics.gofplots import qqplot
    from matplotlib import pyplot
       
    print('-----two methods--------------')
    SS_Residual = sum((y-y_pred)**2)       
    SS_Total = sum((y-np.mean(y))**2)     
    r_squared = 1 - (float(SS_Residual))/SS_Total
    adjusted_r_squared = 1 - (1-r_squared)*(len(y)-1)/(len(y)-X.shape[1]-1)
    print('r2_score:           %0.3f' % r_squared)
    #print('adjusted_r_squared: %0.3f' % adjusted_r_squared)
    #print('----r2_score------second-method--------')
    print('r2_score:           %0.3f' % r2_score(y, y_pred))
    print()
    print('-------------------------------')
    MAE = (abs(y-y_pred)).mean()
    print('Mean absolute error     MAE:  %0.2f ' % MAE)
    RMSE = np.sqrt(((y-y_pred)**2).mean())
    print('Root mean squared error RMSE: %0.2f ' % RMSE)
    pt = (100*(y-y_pred))/y
    MAPE = (abs(pt)).mean()
    print('Mean absolute error     MAPE: %0.2f ' % MAPE)
    print('-------------------------------')
    
    stat,pvalue0 = stats.ttest_1samp(a=(y-y_pred),popmean=0.0)

    if pvalue0 > 0.01:
        print('t-test H0: the sum of the model residuals is zero')
        print('OKAY! Model residuals do not differ from zero - pvalue: %0.4f > 0.01 (We do not reject H0)' % pvalue0)
    else:     
        print('Bad - Model residuals DIFFER FROM ZERO - pvalue: %0.4f <0.01 (We reject H0)' % pvalue0)
    print('--------------------------------------------------------------------------------------------') 
  
       
    stat,pvalue2_1 = stats.shapiro(y)
    stat,pvalue2_2 = stats.shapiro(y_pred)

    if pvalue2_1 > 0.01:
        #print('Shapiro-Wilk H0: does y have a normal distribution?--------------------------------')
        print('OK Shapiro-Wilk! y has a normal distribution - pvalue: %0.4f > 0.01 (We do not reject H0)' % pvalue2_1)
    else:     
        print('Bad Shapiro-Wilk - y NO NORMAL DISTRIBUTION - pvalue: %0.4f <0.01 (We reject H0)' % pvalue2_1)
        print('--------------------------------------------------------------------------------------------')
    if pvalue2_2 > 0.01:
        #print('Shapiro-Wilk: does y_pred have a normal distribution?--')
        print('OK Shapiro-Wilk! y_pred has a normal distribution - pvalue: %0.4f > 0.01 (We do not reject H0)' % pvalue2_2)
    else:     
        print('Bad Shapiro-Wilk y_pred NO NORMAL DISTRIBUTION - pvalue: %0.4f <0.01 (We reject H0)' % pvalue2_2)
    
    qqplot(y, line='s')
    pyplot.show()

    qqplot(y_pred, line='s')
    pyplot.show()
       
    print('--------------------------------------------------------------------------------------------')
        
    stat,pvalue3 = stats.kruskal(y_pred,y)
    stat,pvalue4 = stats.f_oneway(y_pred,y)

    if pvalue2_1 < 0.01 or pvalue2_2 < 0.01:
        print('Shapiro-Wilk: the variables do not have a normal distribution! ANOVA cannot be performed')
     
        if pvalue3 > 0.01:
            print('Kruskal-Wallis NON-PARAMETRIC TEST: do the forecast and the empirical observations have equal means?')
            print('OKAY! Kruskal-Wallis H0: forecast and observations empir. have equal means - pvalue: %0.4f > 0.01 (We do not reject H0)' % pvalue3)
        else:     
            print('Bad - Kruskal-Wallis: forecast and observations empir. DO NOT HAVE EQUAL Averages - pvalue: %0.4f <0.01 (We reject H0)' % pvalue3)
    
    else:

        if pvalue4 > 0.01:
            print('F-test (ANOVA): do the forecast and the empirical observations have equal means?--------------------------------')
            print('OKAY! forecast and observations empir. have equal means - pvalue: %0.4f > 0.01 (We do not reject H0)' % pvalue4)
        else:     
            print('Bad - forecast and observations empir. DO NOT HAVE EQUAL Averages - pvalue: %0.4f <0.01 (We reject H0)' % pvalue4)
    print('--------------------------------------------------------------------------------------------')
In [50]:
y = NOWA['y']
y_pred = NOWA['y_pred']

Regression_Assessment(y, y_pred)
-----two methods--------------
r2_score:           0.799
r2_score:           0.799

-------------------------------
Mean absolute error     MAE:  6.95 
Root mean squared error RMSE: 9.93 
Mean absolute error     MAPE: 91.06 
-------------------------------
Bad - Model residuals DIFFER FROM ZERO - pvalue: 0.0000 <0.01 (We reject H0)
--------------------------------------------------------------------------------------------
Bad Shapiro-Wilk - y NO NORMAL DISTRIBUTION - pvalue: 0.0000 <0.01 (We reject H0)
--------------------------------------------------------------------------------------------
Bad Shapiro-Wilk y_pred NO NORMAL DISTRIBUTION - pvalue: 0.0000 <0.01 (We reject H0)
--------------------------------------------------------------------------------------------
Shapiro-Wilk: the variables do not have a normal distribution! ANOVA cannot be performed
Bad - Kruskal-Wallis: forecast and observations empir. DO NOT HAVE EQUAL Averages - pvalue: 0.0000 <0.01 (We reject H0)
--------------------------------------------------------------------------------------------

The tank Super Test in combat conditions!

In [51]:
print(X_SuperT.shape)
X_SuperT.head(3)
(86, 15)
Out[51]:
instant season yr mnth hr holiday weekday workingday weathersit temp atemp hum windspeed month weekofyear
17293 17294 1 1 12 10 0 4 1 2 0.26 0.2424 0.56 0.2537 12 52
17294 17295 1 1 12 11 0 4 1 2 0.28 0.2727 0.52 0.2239 12 52
17295 17296 1 1 12 12 0 4 1 2 0.30 0.3030 0.49 0.1343 12 52
In [52]:
y_SuperT.head(3)
Out[52]:
17293    162
17294    178
17295    222
Name: cnt, dtype: int64
In [53]:
from sklearn.preprocessing import StandardScaler

sc = StandardScaler()
X_SuperT = sc.fit_transform(X_SuperT)

print(np.round(X_SuperT.std(), decimals=2), np.round(X_SuperT.mean(), decimals=2))
0.82 -0.0
In [ ]:
 
In [54]:
X_SuperT = torch.tensor(X_SuperT)
X_SuperT = X_SuperT.type(torch.FloatTensor)
print(X_SuperT[:3])
tensor([[-1.7120,  0.0000,  0.0000,  0.0000, -0.3405,  0.0000,  0.1161,  1.1239,
          0.6165,  0.3527,  0.1062, -0.2904,  0.3681,  0.0000,  0.6222],
        [-1.6717,  0.0000,  0.0000,  0.0000, -0.1934,  0.0000,  0.1161,  1.1239,
          0.6165,  0.8419,  0.9404, -0.5800,  0.1717,  0.0000,  0.6222],
        [-1.6315,  0.0000,  0.0000,  0.0000, -0.0462,  0.0000,  0.1161,  1.1239,
          0.6165,  1.3312,  1.7746, -0.7971, -0.4189,  0.0000,  0.6222]])
In [55]:
y_SuperT = (y_SuperT / 100)  # max test score is 100
#print(y.head(3))
print(np.round(y_SuperT.std(), decimals=2), np.round(y_SuperT.mean(), decimals=2))
0.77 0.97
In [56]:
y_SuperT = y_SuperT.values
y_SuperT = torch.tensor(y_SuperT)
y_SuperT = y_SuperT.view(y_SuperT.shape[0],1)
y_SuperT.shape
Out[56]:
torch.Size([86, 1])
In [57]:
print('X_SuperT:',X_SuperT.shape)
print('y_SuperT:',y_SuperT.shape)
X_SuperT: torch.Size([86, 15])
y_SuperT: torch.Size([86, 1])
In [ ]:
 
In [58]:
with torch.no_grad():
    y_predST = net(X_SuperT)  
    loss = (y_predST - y_SuperT).pow(2).sum()

    print(f'Loss super_test_set: {loss:.8f}')
Loss super_test_set: 232.59281707
In [ ]:
 
In [59]:
y_predST = y_predST*100
foka = y_predST.cpu().detach().numpy()
df11 = pd.DataFrame(foka)
df11.columns = ['y_predST']
df11=np.round(df11.y_predST)
df11.head(3)
Out[59]:
0    157.0
1    132.0
2    105.0
Name: y_predST, dtype: float32
In [60]:
y_SuperT = y_SuperT*100
y_SuperT = np.round(y_SuperT)
foka = y_SuperT.cpu().detach().numpy()
df_t = pd.DataFrame(foka)
df_t.columns = ['y_ST']
df_t.head(3)
Out[60]:
y_ST
0 162.0
1 178.0
2 222.0
In [61]:
Super_NOWA = pd.merge(df_t,df11, how='inner', left_index=True, right_index=True)
Super_NOWA.head(3)
Out[61]:
y_ST y_predST
0 162.0 157.0
1 178.0 132.0
2 222.0 105.0
In [62]:
fig, ax = plt.subplots( figsize=(16, 2))
for ewa in ['y_ST', 'y_predST']:
    ax.plot(Super_NOWA[ewa], label=ewa)
    
#ax.set_xlim(1340, 1500)
#ax.legend()
ax.set_ylabel('Parameter')
ax.set_title('COURSE OF THE PROJECTING PROCESS ON THE TEST SET')
Out[62]:
Text(0.5, 1.0, 'COURSE OF THE PROJECTING PROCESS ON THE TEST SET')
In [63]:
## margins
plt.subplots_adjust( left = None , bottom = None , right = None , top = None , wspace = None , hspace = None )
plt.figure(figsize=(16,5))
ax = plt.subplot(1, 2, 1)
Super_NOWA.plot.kde(ax=ax, legend=True, title='Histogram: y vs. y_pred')
Super_NOWA.plot.hist(density=True,bins=40, ax=ax, alpha=0.3)
ax.set_title("Distributions")

ax = plt.subplot(1, 2, 2)
sns.boxplot(data = Super_NOWA)
plt.xticks(rotation=-90)
ax.set_title("Boxes")


sns.lmplot(data=Super_NOWA, x='y_ST', y='y_predST')
Out[63]:
<seaborn.axisgrid.FacetGrid at 0x7f9df0d58750>
<Figure size 432x288 with 0 Axes>
In [64]:
y = Super_NOWA['y_ST']
y_pred = Super_NOWA['y_predST']

Regression_Assessment(y, y_pred)
-----two methods--------------
r2_score:           -3.593
r2_score:           -3.593

-------------------------------
Mean absolute error     MAE:  114.26 
Root mean squared error RMSE: 164.43 
Mean absolute error     MAPE: 192.44 
-------------------------------
Bad - Model residuals DIFFER FROM ZERO - pvalue: 0.0000 <0.01 (We reject H0)
--------------------------------------------------------------------------------------------
Bad Shapiro-Wilk - y NO NORMAL DISTRIBUTION - pvalue: 0.0001 <0.01 (We reject H0)
--------------------------------------------------------------------------------------------
Bad Shapiro-Wilk y_pred NO NORMAL DISTRIBUTION - pvalue: 0.0000 <0.01 (We reject H0)
--------------------------------------------------------------------------------------------
Shapiro-Wilk: the variables do not have a normal distribution! ANOVA cannot be performed
Bad - Kruskal-Wallis: forecast and observations empir. DO NOT HAVE EQUAL Averages - pvalue: 0.0044 <0.01 (We reject H0)
--------------------------------------------------------------------------------------------

It turned out badly: techniques for dealing with the model's overfitting should be applied! (One possible approach is sketched below.)
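As a hedged sketch of one possible remedy (this is not the author's code), the network defined above could be regularized, for example with Dropout layers and weight decay in the Adam optimizer; the layer sizes mirror the net used earlier:

# Illustrative, regularized variant of the network used above
net_reg = torch.nn.Sequential(
        torch.nn.Linear(D_in, H),
        torch.nn.LeakyReLU(),
        torch.nn.Dropout(p=0.2),        # randomly zeroes 20% of activations during training
        torch.nn.Linear(H, H),
        torch.nn.LeakyReLU(),
        torch.nn.Dropout(p=0.2),
        torch.nn.Linear(H, D_out),
    ).to(device)
optimizer = torch.optim.Adam(net_reg.parameters(), lr=0.01, weight_decay=1e-4)   # weight_decay adds L2 regularization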

Mean absolute error MAE and RMSE

Percentage errors MAPE
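For reference, a short sketch of how these error measures are computed; the formulas match the ones used in Regression_Assessment above:

import numpy as np

def mae(y, y_pred):  return np.mean(np.abs(y - y_pred))              # mean absolute error
def rmse(y, y_pred): return np.sqrt(np.mean((y - y_pred) ** 2))      # root mean squared error
def mape(y, y_pred): return np.mean(np.abs(100 * (y - y_pred) / y))  # mean absolute percentage error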

Pytorch regression 3.1 [BikeSharing.csv]
https://sigmaquality.pl/models/pytorch/pytorch-regression-3-1-bikesharing-csv-020520201956/
Sat, 02 May 2020

Work on diagnostic systems.

We have replaced the engine in our tank with a more universal one

https://archive.ics.uci.edu/ml/datasets/Bike+Sharing+Dataset

In [1]:
import torch

Starting up the GPU graphics card (which I don't have), so the computations run on the CPU.

In [2]:
device = torch.device('cpu') # I do the computations on the CPU
#device = torch.device('cuda') # I do the computations on the GPU

Output variable: cnt (the count of total rental bikes, including both casual and registered).
In [3]:
import pandas as pd

df = pd.read_csv('/home/wojciech/Pulpit/3/BikeSharing.csv')
print(df.shape)
df.head(3)
(17379, 17)
Out[3]:
instant dteday season yr mnth hr holiday weekday workingday weathersit temp atemp hum windspeed casual registered cnt
0 1 2011-01-01 1 0 1 0 0 6 0 1 0.24 0.2879 0.81 0.0 3 13 16
1 2 2011-01-01 1 0 1 1 0 6 0 1 0.22 0.2727 0.80 0.0 8 32 40
2 3 2011-01-01 1 0 1 2 0 6 0 1 0.22 0.2727 0.80 0.0 5 27 32

cnt: count of total rental bikes including both casual and registered

I fill all missing values ('holes') with out-of-range values.

In [4]:
import matplotlib.pyplot as plt
import seaborn as sns

plt.figure(figsize=(10,6))
CORREL =df.corr()
sns.heatmap(CORREL, annot=True, cbar=False, cmap="coolwarm")
plt.title('Macierz korelacji ze zmienną wynikową y', fontsize=20)
Out[4]:
Text(0.5, 1, 'Macierz korelacji ze zmienną wynikową y')
In [5]:
import matplotlib.pyplot as plt

plt.figure(figsize=(10,6))
CORREL['cnt'].plot(kind='barh', color='red')
plt.title('Korelacja ze zmienną wynikową', fontsize=20)
plt.xlabel('Poziom korelacji')
plt.ylabel('Zmienne nezależne ciągłe')
Out[5]:
Text(0, 0.5, 'Zmienne nezależne ciągłe')

The variables 'registered' and 'casual' are themselves just the outcome shown in a different way, so they must be removed from the data.

In [6]:
a,b = df.shape     #<- how many columns we have
b

print('NUMBER OF EMPTY RECORDS vs. FULL RECORDS')
print('----------------------------------------')
for i in range(1,b):
    i = df.columns[i]
    r = df[i].isnull().sum()
    h = df[i].count()
    pr = (r/h)*100
   
    if r > 0:
        print(i,"--------",r,"--------",h,"--------",pr) 
NUMBER OF EMPTY RECORDS vs. FULL RECORDS
----------------------------------------
In [7]:
import seaborn as sns

sns.heatmap(df.isnull(),yticklabels=False,cbar=False,cmap='viridis')
Out[7]:
<matplotlib.axes._subplots.AxesSubplot at 0x7fb3439c2650>
In [8]:
#del df['Unnamed: 15']
#del df['Unnamed: 16']

df = df.dropna(how='any') # in the end I simply delete the holes (rows with NaN)

# df.fillna(-777, inplace=True)
df.isnull().sum()
Out[8]:
instant       0
dteday        0
season        0
yr            0
mnth          0
hr            0
holiday       0
weekday       0
workingday    0
weathersit    0
temp          0
atemp         0
hum           0
windspeed     0
casual        0
registered    0
cnt           0
dtype: int64
In [9]:
print(df.dtypes)
df.head(3)
instant         int64
dteday         object
season          int64
yr              int64
mnth            int64
hr              int64
holiday         int64
weekday         int64
workingday      int64
weathersit      int64
temp          float64
atemp         float64
hum           float64
windspeed     float64
casual          int64
registered      int64
cnt             int64
dtype: object
Out[9]:
instant dteday season yr mnth hr holiday weekday workingday weathersit temp atemp hum windspeed casual registered cnt
0 1 2011-01-01 1 0 1 0 0 6 0 1 0.24 0.2879 0.81 0.0 3 13 16
1 2 2011-01-01 1 0 1 1 0 6 0 1 0.22 0.2727 0.80 0.0 8 32 40
2 3 2011-01-01 1 0 1 2 0 6 0 1 0.22 0.2727 0.80 0.0 5 27 32

to_datetime

In [10]:
df['dteday'] =  pd.to_datetime(df['dteday'])
df['weekday'] = df.dteday.dt.weekday
df['month'] =df.dteday.dt.month
df['weekofyear'] =df.dteday.dt.weekofyear 
In [11]:
del df['dteday']
In [12]:
print(df.dtypes)
df.head(3)
instant         int64
season          int64
yr              int64
mnth            int64
hr              int64
holiday         int64
weekday         int64
workingday      int64
weathersit      int64
temp          float64
atemp         float64
hum           float64
windspeed     float64
casual          int64
registered      int64
cnt             int64
month           int64
weekofyear      int64
dtype: object
Out[12]:
instant season yr mnth hr holiday weekday workingday weathersit temp atemp hum windspeed casual registered cnt month weekofyear
0 1 1 0 1 0 0 5 0 1 0.24 0.2879 0.81 0.0 3 13 16 1 52
1 2 1 0 1 1 0 5 0 1 0.22 0.2727 0.80 0.0 8 32 40 1 52
2 3 1 0 1 2 0 5 0 1 0.22 0.2727 0.80 0.0 5 27 32 1 52

Encoding the text values

In [13]:
import numpy as np

a,b = df.shape     #<- how many columns we have
b

print('DISCRETE FUNCTIONS CODED')
print('------------------------')
for i in range(1,b):
    i = df.columns[i]
    f = df[i].dtypes
    if f == np.object:
        print(i,"---",f)   
    
        if f == np.object:
        
            df[i] = pd.Categorical(df[i]).codes
        
            continue
DISCRETE FUNCTIONS CODED
------------------------

df['Time'] = pd.Categorical(df['Time']).codes
df['Time'] = df['Time'].astype(int)

In [14]:
df.dtypes
Out[14]:
instant         int64
season          int64
yr              int64
mnth            int64
hr              int64
holiday         int64
weekday         int64
workingday      int64
weathersit      int64
temp          float64
atemp         float64
hum           float64
windspeed     float64
casual          int64
registered      int64
cnt             int64
month           int64
weekofyear      int64
dtype: object
In [15]:
df.columns
Out[15]:
Index(['instant', 'season', 'yr', 'mnth', 'hr', 'holiday', 'weekday',
       'workingday', 'weathersit', 'temp', 'atemp', 'hum', 'windspeed',
       'casual', 'registered', 'cnt', 'month', 'weekofyear'],
      dtype='object')

I specify what is X and what is y.

In [16]:
X = df.drop(['cnt','registered','casual'],1)
y = df['cnt']

Scaling (normalization) of the X values

X should never be too large. Ideally it should lie in the range [-1, 1]. If this is not the case, normalize the input.

In [17]:
from sklearn.preprocessing import StandardScaler

sc = StandardScaler()
X = sc.fit_transform(X)

print(np.round(X.std(), decimals=2), np.round(X.mean(), decimals=2))
1.0 -0.0
In [18]:
y.value_counts()
Out[18]:
5      260
6      236
4      231
3      224
2      208
      ... 
725      1
709      1
661      1
629      1
887      1
Name: cnt, Length: 869, dtype: int64
In [19]:
y = (y / 100)  # scale the target down by a factor of 100
#print(y.head(3))
print(np.round(y.std(), decimals=2), np.round(y.mean(), decimals=2))
1.81 1.89

Converting the input and output data to tensors

In [20]:
import numpy as np

#X = X.values       # - does not work when normalization was applied (X is already a NumPy array)
X = torch.tensor(X)
print(X[:3])
tensor([[-1.7320, -1.3566, -1.0051, -1.6104, -1.6700, -0.1721,  0.9933, -1.4669,
         -0.6652, -1.3346, -1.0933,  0.9474, -1.5539, -1.6104,  1.6913],
        [-1.7318, -1.3566, -1.0051, -1.6104, -1.5254, -0.1721,  0.9933, -1.4669,
         -0.6652, -1.4385, -1.1817,  0.8955, -1.5539, -1.6104,  1.6913],
        [-1.7316, -1.3566, -1.0051, -1.6104, -1.3807, -0.1721,  0.9933, -1.4669,
         -0.6652, -1.4385, -1.1817,  0.8955, -1.5539, -1.6104,  1.6913]],
       dtype=torch.float64)
In [21]:
X = X.type(torch.FloatTensor)
print(X[:3])
tensor([[-1.7320, -1.3566, -1.0051, -1.6104, -1.6700, -0.1721,  0.9933, -1.4669,
         -0.6652, -1.3346, -1.0933,  0.9474, -1.5539, -1.6104,  1.6913],
        [-1.7318, -1.3566, -1.0051, -1.6104, -1.5254, -0.1721,  0.9933, -1.4669,
         -0.6652, -1.4385, -1.1817,  0.8955, -1.5539, -1.6104,  1.6913],
        [-1.7316, -1.3566, -1.0051, -1.6104, -1.3807, -0.1721,  0.9933, -1.4669,
         -0.6652, -1.4385, -1.1817,  0.8955, -1.5539, -1.6104,  1.6913]])
In [22]:
y = y.values   # create a NumPy array - does not work if normalization was applied
In [23]:
y = torch.tensor(y)
print(y[:3])
tensor([0.1600, 0.4000, 0.3200], dtype=torch.float64)

Transposing the result vector so that it becomes a column

y = y.view(y.shape[0],1)
y[:5]

In [24]:
y = y.type(torch.FloatTensor)

# Note: this scaling of y is not actually applied here (StandardScaler expects a 2-D array,
# and the shapes and values further below show y unchanged), so the snippet stays commented out.
#from sklearn.preprocessing import StandardScaler
#sc = StandardScaler()
#y = sc.fit_transform(y)
#print(np.round(y.std(), decimals=2), np.round(y.mean(), decimals=2))

In [25]:
print('X:',X.shape)
print('y:',y.shape)
X: torch.Size([17379, 15])
y: torch.Size([17379])

Adding one dimension to the result vector

In [26]:
y = y.view(y.shape[0],1)
y.shape
Out[26]:
torch.Size([17379, 1])

Splitting into a training set and a test set

In [27]:
a,b = X.shape
a

total_records = a
test_records = int(a * .2)

X_train = X[:total_records-test_records]
X_test = X[total_records-test_records:total_records]

y_train = y[:total_records-test_records]
y_test = y[total_records-test_records:total_records]
In [28]:
print('X_train: ',X_train.shape)
print('X_test:  ',X_test.shape)
print('----------------------------------------------------')
print('y_train: ',y_train.shape)
print('y_test:  ',y_test.shape)
X_train:  torch.Size([13904, 15])
X_test:   torch.Size([3475, 15])
----------------------------------------------------
y_train:  torch.Size([13904, 1])
y_test:   torch.Size([3475, 1])

Defining the neural network

Programming torch.nn.Module
In [29]:
import torch.nn.functional as F   # needed for F.relu below

class Net(torch.nn.Module):
    def __init__(self, n_feature, n_hidden, n_output):
        super(Net, self).__init__()
        self.hidden = torch.nn.Linear(n_feature, n_hidden)   # hidden layer
        self.predict = torch.nn.Linear(n_hidden, n_output)   # output layer

    def forward(self, x):
        x = F.relu(self.hidden(x))      # activation function for hidden layer
        x = self.predict(x)             # linear output
        return x
Defining the shape of the network
In [30]:
N, D_in = X.shape
N, D_out = y.shape

H = 100
device = torch.device('cpu')

model = torch.nn.Sequential(
torch.nn.Linear(D_in, H),
torch.nn.ReLU(),
torch.nn.Linear(H, D_out),
).to(device)

In [31]:
net = torch.nn.Sequential(
        torch.nn.Linear(D_in,  H),
        torch.nn.LeakyReLU(),
        torch.nn.Linear(H, H),
        torch.nn.LeakyReLU(),
        torch.nn.Linear(H, D_out),
    ).to(device)  
In [32]:
net(X_train)
Out[32]:
tensor([[-0.0874],
        [-0.0926],
        [-0.0942],
        ...,
        [ 0.0814],
        [ 0.0851],
        [ 0.1024]], grad_fn=<AddmmBackward>)
The optimization algorithm (the optimizer)

lr: the learning rate -> the speed at which our model updates the weights in the cells each time backward propagation is carried out.

In [33]:
#optimizer = torch.optim.SGD(net.parameters(), lr=0.01, momentum=0, dampening=0, weight_decay=0, nesterov=False)
#optimizer = torch.optim.SGD(net.parameters(), lr=0.1)
optimizer = torch.optim.Adam(net.parameters(), lr=0.01)
#optimizer = torch.optim.Adamax(net.parameters(), lr=0.01)
#optimizer = torch.optim.ASGD(net.parameters(), lr=0.01, lambd=0.0001, alpha=0.15, t0=000000.0)
#optimizer = torch.optim.LBFGS(net.parameters(), lr=0.01, max_iter=20, max_eval=None, tolerance_grad=1e-05, tolerance_change=1e-09, history_size=100, line_search_fn=None)
#optimizer = torch.optim.RMSprop(net.parameters(), lr=0.01, alpha=0.99, eps=1e-08)
#optimizer = torch.optim.Rprop(net.parameters(), lr=0.01, etas=(0.5, 1.2), step_sizes=(1e-06, 50))  #R2:0.77
Defining the loss function

For this regression we use the MSE loss; the R² score reported later by Regression_Assessment is computed from the residuals (a short sketch follows the next cell).

In [34]:
loss_func = torch.nn.MSELoss()
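A minimal sketch, assuming NumPy arrays y and y_pred, of how the R² reported later by Regression_Assessment is obtained from the residuals:

import numpy as np

def r_squared(y, y_pred):
    ss_residual = np.sum((y - y_pred) ** 2)     # unexplained variation
    ss_total = np.sum((y - np.mean(y)) ** 2)    # total variation around the mean
    return 1 - ss_residual / ss_total           # share of the variation explained by the model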

Defining the training process and training the model

In [35]:
inputs = X_train                          #1. declare the x and y used for training
outputs = y_train
for i in range(2000):                     #2. loop over 2000 repetitions (epochs)
   prediction = net(inputs)
   loss = loss_func(prediction, outputs)  # the loss function returns a Tensor containing the loss
   optimizer.zero_grad()
   loss.backward()
   optimizer.step()

   if i % 200 == 0:
      print(i, loss.item())
0 5.711716175079346
200 0.14712589979171753
400 0.10049592703580856
600 0.08884098380804062
800 0.08310769498348236
1000 0.07949385792016983
1200 0.07695646584033966
1400 0.07494604587554932
1600 0.073524110019207
1800 0.07221145927906036

There are many potential reasons. Most likely exploding gradients. The two things to try first:

  • Normalize the inputs
  • Lower the learning rate

import matplotlib.pyplot as plt

plt.plot(range(epochs), aggregated_losses)   # assumes the losses were collected in aggregated_losses during training
plt.ylabel('Loss')
plt.xlabel('epoch')
plt.show()

Forecast based on the model

  • we substitute the same equations that were used in the model
  • the loss result below refers to the last model sequence
  • the loss shows how wrong the model is (loss = sum of squared errors) after the last learning sequence
In [36]:
with torch.no_grad():
    y_pred = net(X_test)  
    loss = (y_pred - y_test).pow(2).sum()

    print(f'Loss test_set: {loss:.8f}')
Loss test_set: 3875.08569336

Since we decided that our output layer would contain 1 neuron, each forecast contains a single value. For example, the first 5 predicted values look like this:

In [37]:
y_pred[:5]
Out[37]:
tensor([[2.9172],
        [3.5864],
        [2.9354],
        [4.5793],
        [7.5099]])

We save the whole model

In [38]:
torch.save(net,'/home/wojciech/Pulpit/7/byk15.pb')

We load the whole model back

In [39]:
KOT = torch.load('/home/wojciech/Pulpit/7/byk15.pb')
KOT.eval()
Out[39]:
Sequential(
  (0): Linear(in_features=15, out_features=100, bias=True)
  (1): LeakyReLU(negative_slope=0.01)
  (2): Linear(in_features=100, out_features=100, bias=True)
  (3): LeakyReLU(negative_slope=0.01)
  (4): Linear(in_features=100, out_features=1, bias=True)
)

By substituting other independent variables, you can get a vector of output variables.

We pick a random record from the tensor.

In [40]:
y_pred = y_pred*10
foka = y_pred.cpu().detach().numpy()
df11 = pd.DataFrame(foka)
df11.columns = ['y_pred']
df11=np.round(df11.y_pred)
df11.head(3)
Out[40]:
0    29.0
1    36.0
2    29.0
Name: y_pred, dtype: float32
In [41]:
y_test = y_test*10
foka = y_test.cpu().detach().numpy()
df_t = pd.DataFrame(foka)
df_t.columns = ['y']
df_t.head(3)
Out[41]:
y
0 25.299999
1 26.099998
2 30.599998
In [42]:
NOWA = pd.merge(df_t,df11, how='inner', left_index=True, right_index=True)
NOWA.head(3)
Out[42]:
y y_pred
0 25.299999 29.0
1 26.099998 36.0
2 30.599998 29.0
In [43]:
NOWA.to_csv('/home/wojciech/Pulpit/7/NOWA.csv')
In [44]:
fig, ax = plt.subplots( figsize=(16, 2))
for ewa in ['y', 'y_pred']:
    ax.plot(NOWA[ewa], label=ewa)
    
ax.set_xlim(1340, 1500)
#ax.legend()
ax.set_ylabel('Parameter')
ax.set_title('COURSE OF THE PROJECTING PROCESS ON THE TEST SET')
Out[44]:
Text(0.5, 1.0, 'COURSE OF THE PROJECTING PROCESS ON THE TEST SET')
In [45]:
## margins
plt.subplots_adjust( left = None , bottom = None , right = None , top = None , wspace = None , hspace = None )
plt.figure(figsize=(16,5))
ax = plt.subplot(1, 2, 1)
NOWA.plot.kde(ax=ax, legend=True, title='Histogram: y vs. y_pred')
NOWA.plot.hist(density=True,bins=40, ax=ax, alpha=0.3)
ax.set_title("Distributions")

ax = plt.subplot(1, 2, 2)
sns.boxplot(data = NOWA)
plt.xticks(rotation=-90)
ax.set_title("Boxes")


sns.lmplot(data=NOWA, x='y', y='y_pred')
Out[45]:
<seaborn.axisgrid.FacetGrid at 0x7fb3400a0990>
<Figure size 432x288 with 0 Axes>

Regression_Assessment

In [46]:
## Performs the assessment for a single target variable only

def Regression_Assessment(y, y_pred):
    
    from sklearn.metrics import r2_score 
    import scipy.stats as stats
    from statsmodels.graphics.gofplots import qqplot
    from matplotlib import pyplot
       
    print('-----two methods--------------')
    SS_Residual = sum((y-y_pred)**2)       
    SS_Total = sum((y-np.mean(y))**2)     
    r_squared = 1 - (float(SS_Residual))/SS_Total
    adjusted_r_squared = 1 - (1-r_squared)*(len(y)-1)/(len(y)-X.shape[1]-1)
    print('r2_score:           %0.3f' % r_squared)
    #print('adjusted_r_squared: %0.3f' % adjusted_r_squared)
    #print('----r2_score------second-method--------')
    print('r2_score:           %0.3f' % r2_score(y, y_pred))
    print()
    print('-------------------------------')
    MAE = (abs(y-y_pred)).mean()
    print('Mean absolute error     MAE:  %0.2f ' % MAE)
    RMSE = np.sqrt(((y-y_pred)**2).mean())
    print('Root mean squared error RMSE: %0.2f ' % RMSE)
    pt = (100*(y-y_pred))/y
    MAPE = (abs(pt)).mean()
    print('Mean absolute error     MAPE: %0.2f ' % MAPE)
    print('-------------------------------')
    
    stat,pvalue0 = stats.ttest_1samp(a=(y-y_pred),popmean=0.0)

    if pvalue0 > 0.01:
        print('t-test H0: the sum of the model residuals is zero')
        print('OKAY! Model residuals do not differ from zero - pvalue: %0.4f > 0.01 (We do not reject H0)' % pvalue0)
    else:     
        print('Bad - Model residuals DIFFER FROM ZERO - pvalue: %0.4f <0.01 (We reject H0)' % pvalue0)
    print('--------------------------------------------------------------------------------------------') 
  
       
    stat,pvalue2_1 = stats.shapiro(y)
    stat,pvalue2_2 = stats.shapiro(y_pred)

    if pvalue2_1 > 0.01:
        #print('Shapiro-Wilk H0: does y have a normal distribution?--------------------------------')
        print('OK Shapiro-Wilk! y has a normal distribution - pvalue: %0.4f > 0.01 (We do not reject H0)' % pvalue2_1)
    else:     
        print('Bad Shapiro-Wilk - y NO NORMAL DISTRIBUTION - pvalue: %0.4f <0.01 (We reject H0)' % pvalue2_1)
        print('--------------------------------------------------------------------------------------------')
    if pvalue2_2 > 0.01:
        #print('Shapiro-Wilk: does y_pred have a normal distribution?--')
        print('OK Shapiro-Wilk! y_pred has a normal distribution - pvalue: %0.4f > 0.01 (We do not reject H0)' % pvalue2_2)
    else:     
        print('Bad Shapiro-Wilk y_pred NO NORMAL DISTRIBUTION - pvalue: %0.4f <0.01 (We reject H0)' % pvalue2_2)
    
    qqplot(y, line='s')
    pyplot.show()

    qqplot(y_pred, line='s')
    pyplot.show()
       
    print('--------------------------------------------------------------------------------------------')
        
    stat,pvalue3 = stats.kruskal(y_pred,y)
    stat,pvalue4 = stats.f_oneway(y_pred,y)

    if pvalue2_1 < 0.01 or pvalue2_2 < 0.01:
        print('Shapiro-Wilk: the variables do not have a normal distribution! ANOVA cannot be performed')
     
        if pvalue3 > 0.01:
            print('Kruskal-Wallis NON-PARAMETRIC TEST: do the forecast and the empirical observations have equal means?')
            print('OKAY! Kruskal-Wallis H0: forecast and observations empir. have equal means - pvalue: %0.4f > 0.01 (We do not reject H0)' % pvalue3)
        else:     
            print('Bad - Kruskal-Wallis: forecast and observations empir. DO NOT HAVE EQUAL Averages - pvalue: %0.4f <0.01 (We reject H0)' % pvalue3)
    
    else:

        if pvalue4 > 0.01:
            print('F-test (ANOVA): do the forecast and the empirical observations have equal means?--------------------------------')
            print('OKAY! forecast and observations empir. have equal means - pvalue: %0.4f > 0.01 (We do not reject H0)' % pvalue4)
        else:     
            print('Bad - forecast and observations empir. DO NOT HAVE EQUAL Averages - pvalue: %0.4f <0.01 (We reject H0)' % pvalue4)
    print('--------------------------------------------------------------------------------------------')
In [47]:
y = NOWA['y']
y_pred = NOWA['y_pred']

Regression_Assessment(y, y_pred)
-----two methods--------------
r2_score:           0.770
r2_score:           0.770

-------------------------------
Mean absolute error     MAE:  7.37 
Root mean squared error RMSE: 10.57 
Mean absolute error     MAPE: 143.06 
-------------------------------
Bad - Model residuals DIFFER FROM ZERO - pvalue: 0.0000 <0.01 (We reject H0)
--------------------------------------------------------------------------------------------
Bad Shapiro-Wilk - y NO NORMAL DISTRIBUTION - pvalue: 0.0000 <0.01 (We reject H0)
--------------------------------------------------------------------------------------------
Bad Shapiro-Wilk y_pred NO NORMAL DISTRIBUTION - pvalue: 0.0000 <0.01 (We reject H0)
--------------------------------------------------------------------------------------------
Shapiro-Wilk: the variables do not have a normal distribution! ANOVA cannot be performed
Bad - Kruskal-Wallis: forecast and observations empir. DO NOT HAVE EQUAL Averages - pvalue: 0.0087 <0.01 (We reject H0)
--------------------------------------------------------------------------------------------

Mean absolute error MAE and RMSE

Percentage errors MAPE

Pytorch regression 2.3 [BikeSharing.csv]
https://sigmaquality.pl/models/pytorch/pytorch-regression-2-3-bikesharing-csv-020520201513/
Sat, 02 May 2020

Work on diagnostic systems.
There is no progress in tank construction without diagnostics.

https://archive.ics.uci.edu/ml/datasets/Bike+Sharing+Dataset

In [1]:
import torch

Starting up the GPU graphics card (which I don't have), so the computations run on the CPU.

In [2]:
device = torch.device('cpu') # I do the computations on the CPU
#device = torch.device('cuda') # I do the computations on the GPU

Output variable: cnt (the count of total rental bikes, including both casual and registered).
In [3]:
import pandas as pd

df = pd.read_csv('/home/wojciech/Pulpit/3/BikeSharing.csv')
print(df.shape)
df.head(3)
(17379, 17)
Out[3]:
instant dteday season yr mnth hr holiday weekday workingday weathersit temp atemp hum windspeed casual registered cnt
0 1 2011-01-01 1 0 1 0 0 6 0 1 0.24 0.2879 0.81 0.0 3 13 16
1 2 2011-01-01 1 0 1 1 0 6 0 1 0.22 0.2727 0.80 0.0 8 32 40
2 3 2011-01-01 1 0 1 2 0 6 0 1 0.22 0.2727 0.80 0.0 5 27 32

cnt: count of total rental bikes including both casual and registered

I fill all missing values ('holes') with out-of-range values.

In [4]:
import matplotlib.pyplot as plt
import seaborn as sns

plt.figure(figsize=(10,6))
CORREL =df.corr()
sns.heatmap(CORREL, annot=True, cbar=False, cmap="coolwarm")
plt.title('Macierz korelacji ze zmienną wynikową y', fontsize=20)
Out[4]:
Text(0.5, 1, 'Macierz korelacji ze zmienną wynikową y')
In [5]:
import matplotlib.pyplot as plt

plt.figure(figsize=(10,6))
CORREL['cnt'].plot(kind='barh', color='red')
plt.title('Korelacja ze zmienną wynikową', fontsize=20)
plt.xlabel('Poziom korelacji')
plt.ylabel('Zmienne nezależne ciągłe')
Out[5]:
Text(0, 0.5, 'Zmienne nezależne ciągłe')

The variables 'registered' and 'casual' are themselves just the outcome shown in a different way, so they must be removed from the data.

In [6]:
a,b = df.shape     #<- how many columns we have
b

print('NUMBER OF EMPTY RECORDS vs. FULL RECORDS')
print('----------------------------------------')
for i in range(1,b):
    i = df.columns[i]
    r = df[i].isnull().sum()
    h = df[i].count()
    pr = (r/h)*100
   
    if r > 0:
        print(i,"--------",r,"--------",h,"--------",pr) 
NUMBER OF EMPTY RECORDS vs. FULL RECORDS
----------------------------------------
In [7]:
import seaborn as sns

sns.heatmap(df.isnull(),yticklabels=False,cbar=False,cmap='viridis')
Out[7]:
<matplotlib.axes._subplots.AxesSubplot at 0x7fc07f183190>
In [8]:
#del df['Unnamed: 15']
#del df['Unnamed: 16']

df = df.dropna(how='any') # in the end I simply delete the holes (rows with NaN)

# df.fillna(-777, inplace=True)
df.isnull().sum()
Out[8]:
instant       0
dteday        0
season        0
yr            0
mnth          0
hr            0
holiday       0
weekday       0
workingday    0
weathersit    0
temp          0
atemp         0
hum           0
windspeed     0
casual        0
registered    0
cnt           0
dtype: int64
In [9]:
print(df.dtypes)
df.head(3)
instant         int64
dteday         object
season          int64
yr              int64
mnth            int64
hr              int64
holiday         int64
weekday         int64
workingday      int64
weathersit      int64
temp          float64
atemp         float64
hum           float64
windspeed     float64
casual          int64
registered      int64
cnt             int64
dtype: object
Out[9]:
instant dteday season yr mnth hr holiday weekday workingday weathersit temp atemp hum windspeed casual registered cnt
0 1 2011-01-01 1 0 1 0 0 6 0 1 0.24 0.2879 0.81 0.0 3 13 16
1 2 2011-01-01 1 0 1 1 0 6 0 1 0.22 0.2727 0.80 0.0 8 32 40
2 3 2011-01-01 1 0 1 2 0 6 0 1 0.22 0.2727 0.80 0.0 5 27 32

to_datetime

In [10]:
df['dteday'] =  pd.to_datetime(df['dteday'])
df['weekday'] = df.dteday.dt.weekday
df['month'] =df.dteday.dt.month
df['weekofyear'] =df.dteday.dt.weekofyear 
In [11]:
del df['dteday']
In [12]:
print(df.dtypes)
df.head(3)
instant         int64
season          int64
yr              int64
mnth            int64
hr              int64
holiday         int64
weekday         int64
workingday      int64
weathersit      int64
temp          float64
atemp         float64
hum           float64
windspeed     float64
casual          int64
registered      int64
cnt             int64
month           int64
weekofyear      int64
dtype: object
Out[12]:
instant season yr mnth hr holiday weekday workingday weathersit temp atemp hum windspeed casual registered cnt month weekofyear
0 1 1 0 1 0 0 5 0 1 0.24 0.2879 0.81 0.0 3 13 16 1 52
1 2 1 0 1 1 0 5 0 1 0.22 0.2727 0.80 0.0 8 32 40 1 52
2 3 1 0 1 2 0 5 0 1 0.22 0.2727 0.80 0.0 5 27 32 1 52

Encoding the text values

In [13]:
import numpy as np

a,b = df.shape     #<- how many columns we have
b

print('DISCRETE FUNCTIONS CODED')
print('------------------------')
for i in range(1,b):
    i = df.columns[i]
    f = df[i].dtypes
    if f == np.object:
        print(i,"---",f)   
    
        if f == np.object:
        
            df[i] = pd.Categorical(df[i]).codes
        
            continue
DISCRETE FUNCTIONS CODED
------------------------

df['Time'] = pd.Categorical(df['Time']).codes
df['Time'] = df['Time'].astype(int)

In [14]:
df.dtypes
Out[14]:
instant         int64
season          int64
yr              int64
mnth            int64
hr              int64
holiday         int64
weekday         int64
workingday      int64
weathersit      int64
temp          float64
atemp         float64
hum           float64
windspeed     float64
casual          int64
registered      int64
cnt             int64
month           int64
weekofyear      int64
dtype: object
In [15]:
df.columns
Out[15]:
Index(['instant', 'season', 'yr', 'mnth', 'hr', 'holiday', 'weekday',
       'workingday', 'weathersit', 'temp', 'atemp', 'hum', 'windspeed',
       'casual', 'registered', 'cnt', 'month', 'weekofyear'],
      dtype='object')

I specify what is X and what is y.

In [16]:
X = df.drop(['cnt','registered','casual'],1)
y = df['cnt']

Scaling (normalization) of the X values

X should never be too large. Ideally it should lie in the range [-1, 1]. If this is not the case, normalize the input.

In [17]:
from sklearn.preprocessing import StandardScaler

sc = StandardScaler()
X = sc.fit_transform(X)

print(np.round(X.std(), decimals=2), np.round(X.mean(), decimals=2))
1.0 -0.0
In [18]:
y.value_counts()
Out[18]:
5      260
6      236
4      231
3      224
2      208
      ... 
725      1
709      1
661      1
629      1
887      1
Name: cnt, Length: 869, dtype: int64
In [19]:
y = (y / 100)  # scale the target down by a factor of 100
#print(y.head(3))
print(np.round(y.std(), decimals=2), np.round(y.mean(), decimals=2))
1.81 1.89

Converting the input and output data to tensors

In [20]:
import numpy as np

#X = X.values       # - does not work when normalization was applied (X is already a NumPy array)
X = torch.tensor(X)
print(X[:3])
tensor([[-1.7320, -1.3566, -1.0051, -1.6104, -1.6700, -0.1721,  0.9933, -1.4669,
         -0.6652, -1.3346, -1.0933,  0.9474, -1.5539, -1.6104,  1.6913],
        [-1.7318, -1.3566, -1.0051, -1.6104, -1.5254, -0.1721,  0.9933, -1.4669,
         -0.6652, -1.4385, -1.1817,  0.8955, -1.5539, -1.6104,  1.6913],
        [-1.7316, -1.3566, -1.0051, -1.6104, -1.3807, -0.1721,  0.9933, -1.4669,
         -0.6652, -1.4385, -1.1817,  0.8955, -1.5539, -1.6104,  1.6913]],
       dtype=torch.float64)
In [21]:
X = X.type(torch.FloatTensor)
print(X[:3])
tensor([[-1.7320, -1.3566, -1.0051, -1.6104, -1.6700, -0.1721,  0.9933, -1.4669,
         -0.6652, -1.3346, -1.0933,  0.9474, -1.5539, -1.6104,  1.6913],
        [-1.7318, -1.3566, -1.0051, -1.6104, -1.5254, -0.1721,  0.9933, -1.4669,
         -0.6652, -1.4385, -1.1817,  0.8955, -1.5539, -1.6104,  1.6913],
        [-1.7316, -1.3566, -1.0051, -1.6104, -1.3807, -0.1721,  0.9933, -1.4669,
         -0.6652, -1.4385, -1.1817,  0.8955, -1.5539, -1.6104,  1.6913]])
In [22]:
y = y.values   # tworzymy macierz numpy - jak była normalizacja to to nie działa
In [23]:
y = torch.tensor(y)
print(y[:3])
tensor([0.1600, 0.4000, 0.3200], dtype=torch.float64)

Transposes the result vector so that it becomes a column

Transponuje wektor wynikowy, aby stał się kolumną

y = y.view(y.shape[0],1)
y[:5]

In [24]:
y = y.type(torch.FloatTensor)

from sklearn.preprocessing import StandardScaler

sc = StandardScaler()
y = sc.fit_transform(y)   # NOTE: y is still a 1-D tensor here, so this call raises an error
                          # (StandardScaler expects a 2-D array); the cnt/100 values above are what is used later

print(np.round(y.std(), decimals=2), np.round(y.mean(), decimals=2))

In [25]:
print('X:',X.shape)
print('y:',y.shape)
X: torch.Size([17379, 15])
y: torch.Size([17379])

Adding one dimension to the result vector

Dodanie jednego wymiaru do wektora wynikowego

In [26]:
y = y.view(y.shape[0],1)
y.shape
Out[26]:
torch.Size([17379, 1])
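For reference, a small sketch of a few equivalent ways to add that trailing dimension (the tensor below is a stand-in, not the notebook's data):

import torch

y_demo = torch.arange(5, dtype=torch.float32)     # stand-in for a 1-D target vector
col_view      = y_demo.view(y_demo.shape[0], 1)   # the approach used in this notebook
col_unsqueeze = y_demo.unsqueeze(1)               # inserts a dimension at position 1
col_reshape   = y_demo.reshape(-1, 1)             # -1 lets torch infer the row count

assert col_view.shape == col_unsqueeze.shape == col_reshape.shape == (5, 1)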

Split into a test set and a training set

Podział na zbiór testowy i zbiór treningowy

In [27]:
a,b = X.shape
a

total_records = a
test_records = int(a * .2)

X_train = X[:total_records-test_records]
X_test = X[total_records-test_records:total_records]

y_train = y[:total_records-test_records]
y_test = y[total_records-test_records:total_records]
In [28]:
print('X_train: ',X_train.shape)
print('X_test:  ',X_test.shape)
print('----------------------------------------------------')
print('y_train: ',y_train.shape)
print('y_test:  ',y_test.shape)
X_train:  torch.Size([13904, 15])
X_test:   torch.Size([3475, 15])
----------------------------------------------------
y_train:  torch.Size([13904, 1])
y_test:   torch.Size([3475, 1])
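The split above is purely sequential: the first 80% of the hours go to training and the last 20% to testing, which behaves like a time-based split. If a shuffled split is preferred instead, a minimal sketch with torch.randperm (the seed is an arbitrary assumption):

import torch

torch.manual_seed(0)                       # assumed seed, for a reproducible shuffle
perm = torch.randperm(X.shape[0])          # random permutation of the row indices
test_records = int(X.shape[0] * 0.2)

test_idx, train_idx = perm[:test_records], perm[test_records:]
X_train, X_test = X[train_idx], X[test_idx]
y_train, y_test = y[train_idx], y[test_idx]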
In [ ]:
 
In [29]:
y_train
Out[29]:
tensor([[0.1600],
        [0.4000],
        [0.3200],
        ...,
        [2.5000],
        [2.1400],
        [2.8300]])
In [ ]:
 

Model

In [30]:
N, D_in = X_train.shape
N, D_out = y_train.shape

H = 10
device = torch.device('cpu')
In [31]:
model = torch.nn.Sequential(
          torch.nn.Linear(D_in, H),
          torch.nn.ReLU(),
          torch.nn.ReLU(),
          torch.nn.Linear(H, D_out),
        ).to(device)
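One remark on the stack above: two ReLU layers in a row compute the same function as a single one, since relu(relu(x)) == relu(x), so this is effectively a one-hidden-layer network. If a second nonlinearity was intended, it needs its own Linear layer in between; a sketch of that variant (not the model used in this notebook):

# hypothetical two-hidden-layer variant; H is reused for both hidden widths
model2 = torch.nn.Sequential(
          torch.nn.Linear(D_in, H),
          torch.nn.ReLU(),
          torch.nn.Linear(H, H),
          torch.nn.ReLU(),
          torch.nn.Linear(H, D_out),
        ).to(device)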

MSE loss function

Funkcja straty MSE

In [32]:
loss_fn = torch.nn.MSELoss(reduction='sum')
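With reduction='sum' the loss adds up the squared errors over all 13 904 training rows, which is why the raw loss values printed below are in the tens of thousands and why such a tiny learning rate is needed. reduction='mean' gives a per-sample loss that is easier to compare across dataset sizes; roughly, the learning rate can then be made about N times larger. A tiny sketch of the relationship (the numbers are illustrative):

mse_sum  = torch.nn.MSELoss(reduction='sum')
mse_mean = torch.nn.MSELoss(reduction='mean')

a = torch.tensor([1.0, 2.0, 3.0])
b = torch.tensor([1.5, 2.0, 2.0])

print(mse_sum(a, b).item())    # 1.25  (0.25 + 0.0 + 1.0)
print(mse_mean(a, b).item())   # ~0.4167, i.e. 1.25 divided by the 3 elements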

Defining the learning process

Definiowanie nauki

In [33]:
y_pred = model(X_train)
y_pred[:5]
Out[33]:
tensor([[-0.2068],
        [-0.2212],
        [-0.2255],
        [-0.2310],
        [-0.2353]], grad_fn=<SliceBackward>)
In [34]:
learning_rate = 0.00001
epochs = 3000
aggregated_losses = []

for t in range(epochs):

   y_pred = model(X_train)

   loss = loss_fn(y_pred, y_train) # <= compute the loss: we pass Tensors containing the predicted and the true
                                   #    y values, and the loss function returns a Tensor containing the loss

   if t % 60 == 0:
      print(t, loss.item())

   aggregated_losses.append(loss)  ## kept for the loss plot below

   model.zero_grad()    #<= zero the gradients before running the backward pass


   loss.backward()      #<== backward pass: compute the gradient of the loss with respect to all learnable
                                 # parameters of the model; internally, each module's parameters are stored
                                 # in Tensors with requires_grad=True, so this call computes the gradients
                                 # of all learnable parameters in the model

   with torch.no_grad():              #<== update the weights with gradient descent; each parameter is a Tensor,
     for param in model.parameters():         # so we can access its data and gradients as before
       param.data -= learning_rate * param.grad
0 88013.1953125
60 22726.017578125
120 20911.103515625
180 20322.68359375
240 17414.154296875
300 16281.208984375
360 15844.0830078125
420 15470.3828125
480 15215.95703125
540 15039.322265625
600 15044.9013671875
660 14982.2099609375
720 15066.326171875
780 15060.884765625
840 15027.4072265625
900 14939.49609375
960 14919.1787109375
1020 14819.20703125
1080 14801.2666015625
1140 14776.3369140625
1200 14679.6806640625
1260 14587.0712890625
1320 14552.8037109375
1380 14547.2509765625
1440 14535.8486328125
1500 14514.3427734375
1560 14504.173828125
1620 14501.9658203125
1680 14488.0185546875
1740 14470.439453125
1800 14459.318359375
1860 14450.2373046875
1920 14441.7578125
1980 14446.7548828125
2040 14439.1953125
2100 14426.0029296875
2160 14417.1171875
2220 14414.8642578125
2280 14407.0009765625
2340 14285.0576171875
2400 14207.20703125
2460 14163.3486328125
2520 14199.5185546875
2580 14156.83984375
2640 14136.0185546875
2700 14182.8544921875
2760 14206.5
2820 14197.958984375
2880 14241.24609375
2940 14252.1787109375

There are many potential reasons for a loss curve like this; the most likely is exploding gradients. The two things to try first (a sketch with an optimizer and gradient clipping follows after these bullets):

  • Normalize the inputs
  • Lower the learning rate

Istnieje wiele potencjalnych przyczyn. Najprawdopodobniej wybuchające gradienty. Dwie rzeczy do wypróbowania w pierwszej kolejności:

  • Normalizuj wejścia
  • Obniż tempo uczenia się
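On top of the two bullets above, a hedged sketch of the same training loop written with a torch.optim optimizer and gradient clipping, which is a common guard against exploding gradients (Adam, the learning rate and the clipping threshold are assumptions, not what this notebook used):

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)   # assumed optimizer and learning rate
loss_fn   = torch.nn.MSELoss(reduction='mean')

for t in range(3000):
    y_pred = model(X_train)
    loss = loss_fn(y_pred, y_train)

    optimizer.zero_grad()
    loss.backward()
    # clip the gradient norm to tame exploding gradients (threshold is an assumption)
    torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)
    optimizer.step()

    if t % 300 == 0:
        print(t, loss.item())   # append loss.item() if keeping a history, to avoid storing the graph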
In [35]:
import matplotlib.pyplot as plt

plt.plot(range(epochs), aggregated_losses)
plt.ylabel('Loss')
plt.xlabel('epoch')
plt.show
Out[35]:
<function matplotlib.pyplot.show(*args, **kw)>

Forecast based on the model

  • we plug in the same equations that were used in the model
  • the loss value below corresponds to the model's last learning pass
  • the loss shows how wrong the model is (loss = sum of squared errors) after the last learning pass

Prognoza na podstawie modelu

  • podstawiamy te same równania, które były w modelu
  • Poniższy wynik loss pokazuje ostatnią sekwencję modelu
  • Loss pokazuje, ile myli się model (loss = suma kwadratów błędów) po ostatniej sekwencji uczenia się
    obraz.png
In [36]:
with torch.no_grad():
    y_pred = model(X_test)  
    loss = (y_pred - y_test).pow(2).sum()

    print(f'Loss test_set: {loss:.8f}')
Loss test_set: 8333.57128906

Since we decided that the output layer has a single neuron, each prediction contains one value. For example, the first 5 predicted values look like this:

Ponieważ ustaliliśmy, że nasza warstwa wyjściowa będzie zawierać 1 neuron, każda prognoza będzie zawierać 1 wartość. Przykładowo pierwsze 5 przewidywanych wartości wygląda następująco:

In [37]:
y_pred[:5]
Out[37]:
tensor([[3.6599],
        [4.0918],
        [4.8715],
        [3.9843],
        [5.0055]])

We save the whole model

Zapisujemy cały model

In [38]:
torch.save(model,'/home/wojciech/Pulpit/7/byk15.pb')

We restore the whole model

Odtwarzamy cały model

In [39]:
KOT = torch.load('/home/wojciech/Pulpit/7/byk15.pb')
KOT.eval()
Out[39]:
Sequential(
  (0): Linear(in_features=15, out_features=10, bias=True)
  (1): ReLU()
  (2): ReLU()
  (3): Linear(in_features=10, out_features=1, bias=True)
)
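Saving the whole module with torch.save(model, ...) works, but it pickles the model class by reference; the PyTorch documentation generally recommends saving only the state_dict. A minimal sketch of that approach (the file name is illustrative):

# save only the learned parameters
torch.save(model.state_dict(), '/home/wojciech/Pulpit/7/byk15_state.pt')

# to restore, rebuild the same architecture first, then load the weights
model_restored = torch.nn.Sequential(
          torch.nn.Linear(D_in, H),
          torch.nn.ReLU(),
          torch.nn.ReLU(),
          torch.nn.Linear(H, D_out),
        )
model_restored.load_state_dict(torch.load('/home/wojciech/Pulpit/7/byk15_state.pt'))
model_restored.eval()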

By substituting other independent variables, you can get a vector of output variables

We choose a random record from the tensor

Podstawiając inne zmienne niezależne można uzyskać wektor zmiennych wyjściowych

Wybieramy sobie jakąś losowy rekord z tensora

obraz.png

In [40]:
y_pred = y_pred*10
foka = y_pred.cpu().detach().numpy()
df11 = pd.DataFrame(foka)
df11.columns = ['y_pred']
df11=np.round(df11.y_pred)
df11.head(3)
Out[40]:
0    37.0
1    41.0
2    49.0
Name: y_pred, dtype: float32
In [41]:
y_test = y_test*10
foka = y_test.cpu().detach().numpy()
df_t = pd.DataFrame(foka)
df_t.columns = ['y']
df_t.head(3)
Out[41]:
y
0 25.299999
1 26.099998
2 30.599998
In [42]:
NOWA = pd.merge(df_t,df11, how='inner', left_index=True, right_index=True)
NOWA.head(3)
Out[42]:
y y_pred
0 25.299999 37.0
1 26.099998 41.0
2 30.599998 49.0
In [43]:
NOWA.to_csv('/home/wojciech/Pulpit/7/NOWA.csv')
In [44]:
fig, ax = plt.subplots( figsize=(16, 2))
for ewa in ['y', 'y_pred']:
    ax.plot(NOWA, label=ewa)
    
ax.set_xlim(1340, 1500)
#ax.legend()
ax.set_ylabel('Parameter')
ax.set_title('COURSE OF THE PROJECTING PROCESS ON THE TEST SET')
Out[44]:
Text(0.5, 1.0, 'COURSE OF THE PROJECTING PROCESS ON THE TEST SET')
In [45]:
## marginesy
plt.subplots_adjust( left = None , bottom = None , right = None , top = None , wspace = None , hspace = None )
plt.figure(figsize=(16,5))
ax = plt.subplot(1, 2, 1)
NOWA.plot.kde(ax=ax, legend=True, title='Histogram: y vs. y_pred')
NOWA.plot.hist(density=True,bins=40, ax=ax, alpha=0.3)
ax.set_title("Dystributions")

ax = plt.subplot(1, 2, 2)
sns.boxplot(data = NOWA)
plt.xticks(rotation=-90)
ax.set_title("Boxes")


sns.lmplot(data=NOWA, x='y', y='y_pred')
Out[45]:
<seaborn.axisgrid.FacetGrid at 0x7fc07c131750>
<Figure size 432x288 with 0 Axes>

Regression_Assessment

In [46]:
## Makes the assessment for a single variable only (Robi ocenę tylko dla jednej zmiennej)

def Regression_Assessment(y, y_pred):
    
    from sklearn.metrics import r2_score 
    import scipy.stats as stats
    from statsmodels.graphics.gofplots import qqplot
    from matplotlib import pyplot
       
    print('-----two methods--------------')
    SS_Residual = sum((y-y_pred)**2)
    SS_Total = sum((y-np.mean(y))**2)
    r_squared = 1 - (float(SS_Residual))/SS_Total
    adjusted_r_squared = 1 - (1-r_squared)*(len(y)-1)/(len(y)-X.shape[1]-1)
    print('r2_score:           %0.3f' % r_squared)
    #print('adjusted_r_squared: %0.3f' % adjusted_r_squared)
    #print('----r2_score------second-method--------')  
    print('r2_score:           %0.3f' % r2_score(y, y_pred))
    print()
    print('-------------------------------')
    MAE = (abs(y-y_pred)).mean()
    print('Mean absolute error     MAE:  %0.2f ' % MAE)
    RMSE = np.sqrt(((y-y_pred)**2).mean())
    print('Root mean squared error RMSE: %0.2f ' % RMSE)
    pt = (100*(y-y_pred))/y
    MAPE = (abs(pt)).mean()
    print('Mean absolute error     MAPE: %0.2f ' % MAPE)
    print('-------------------------------')
    
    stat,pvalue0 = stats.ttest_1samp(a=(y-y_pred),popmean=0.0)

    if pvalue0 > 0.01:
        print('t-test H0:suma reszt modelu wynosi zero--')
        print('OK! Resztki modelu nie różnią się od zera - pvalue: %0.4f > 0.01 (Nie odrzucamy H0)' % pvalue0)
    else:     
        print('Źle - Resztki modelu RÓŻNIĄ SIĘ OD ZERA - pvalue: %0.4f < 0.01 (Odrzucamy H0)' % pvalue0)
    print('--------------------------------------------------------------------------------------------') 
  
       
    stat,pvalue2_1 = stats.shapiro(y)
    stat,pvalue2_2 = stats.shapiro(y_pred)

    if pvalue2_1 > 0.01:
        #print('Shapiro-Wilk H0: y maj rozkład normalny?--------------------------------')
        print('OK Shapiro-Wilk! y maja rozkład normalny - pvalue: %0.4f > 0.01 (Nie odrzucamy H0)' % pvalue2_1)
    else:     
        print('Źle Shapiro-Wilk - y NIE MA ROZKŁADU NORMALNEGO - pvalue: %0.4f < 0.01 (Odrzucamy H0)' % pvalue2_1)
        print('--------------------------------------------------------------------------------------------')
    if pvalue2_2 > 0.01:
        #print('Shapiro-Wilk: y_pred maj rozkład normalny?--')
        print('OK Shapiro-Wilk! y_pred ma rozkład normalny - pvalue: %0.4f > 0.01 (Nie odrzucamy H0)' % pvalue2_2)
    else:     
        print('Źle Shapiro-Wilk y_pred NIE MA ROZKŁADU NORMALNEGO - pvalue: %0.4f < 0.01 (Odrzucamy H0)' % pvalue2_2)
    
    qqplot(y, line='s')
    pyplot.show()

    qqplot(y_pred, line='s')
    pyplot.show()
       
    print('--------------------------------------------------------------------------------------------')
        
    stat,pvalue3 = stats.kruskal(y_pred,y)
    stat,pvalue4 = stats.f_oneway(y_pred,y)

    if pvalue2_1 < 0.01 or pvalue2_2 < 0.01:
        print('Shapiro-Wilk: Zmienne nie mają rozkładu normalnego! Nie można zrobić analizy ANOVA')
     
        if pvalue3 > 0.01:
            print('Kruskal-Wallis NON-PARAMETRIC TEST: czy prognoza i obserwacje empir. mają równe średnie?')
            print('OK! Kruskal-Wallis H0: prognoza i obserwacje empir. mają równe średnie - pvalue: %0.4f > 0.01 (Nie odrzucamy H0)' % pvalue3)
        else:     
            print('Źle - Kruskal-Wallis: prognoza i obserwacje empir. NIE MAJĄ równych średnich - pvalue: %0.4f < 0.01 (Odrzucamy H0)' % pvalue3)
    
    else:

        if pvalue4 > 0.01:
            print('F-test (ANOVA): czy prognoza i obserwacje empir. mają równe średnie?--------------------------------')
            print('OK! prognoza i obserwacje empir. mają równe średnie - pvalue: %0.4f > 0.01 (Nie odrzucamy H0)' % pvalue4)
        else:     
            print('Źle - prognoza i obserwacje empir. NIE MAJĄ równych średnich - pvalue: %0.4f < 0.01 (Odrzucamy H0)' % pvalue4)
    print('--------------------------------------------------------------------------------------------')
In [47]:
y = NOWA['y']
y_pred = NOWA['y_pred']

Regression_Assessment(y, y_pred)
-----two methods--------------
r2_score:           0.507
r2_score:           0.507

-------------------------------
Mean absolute error     MAE:  12.80 
Root mean squared error RMSE: 15.48 
Mean absolute error     MAPE: 341.00 
-------------------------------
Źle - Resztki modelu RÓŻNIĄ SIĘ OD ZERA - pvalue: 0.0000 < 0.01 (Odrzucamy H0)
--------------------------------------------------------------------------------------------
Źle Shapiro-Wilk - y NIE MA ROZKŁADU NORMALNEGO - pvalue: 0.0000 < 0.01 (Odrzucamy H0)
--------------------------------------------------------------------------------------------
Źle Shapiro-Wilk y_pred NIE MA ROZKŁADU NORMALNEGO - pvalue: 0.0000 < 0.01 (Odrzucamy H0)
--------------------------------------------------------------------------------------------
Shapiro-Wilk: Zmienne nie mają rozkładu normalnego! Nie można zrobić analizy ANOVA
Źle - Kruskal-Wallis: prognoza i obserwacje empir. NIE MAJĄ równych średnich - pvalue: 0.0000 < 0.01 (Odrzucamy H0)
--------------------------------------------------------------------------------------------
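As a quick sanity check of the hand-rolled metrics above, the same MAE, RMSE and R² can be reproduced with scikit-learn on the NOWA frame built earlier (a short sketch):

from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score
import numpy as np

y_true = NOWA['y']
y_hat  = NOWA['y_pred']

print('MAE :', mean_absolute_error(y_true, y_hat))
print('RMSE:', np.sqrt(mean_squared_error(y_true, y_hat)))
print('R2  :', r2_score(y_true, y_hat))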

Mean absolute error MAE and RMSE

obraz.png

Percentage errors MAPE

obraz.png

obraz.png

Artykuł Pytorch regression 3.7 [BikeSharing.csv] pochodzi z serwisu THE DATA SCIENCE LIBRARY.

]]>
Pytorch regression 2.3 [AirQualityUCI.csv] https://sigmaquality.pl/models/pytorch/pytorch-regression-2-3-airqualityuci-csv-020520201302/ Sat, 02 May 2020 11:04:23 +0000 http://sigmaquality.pl/pytorch-regression-2-3-airqualityuci-csv-020520201302/ 020520201302 Work on diagnostic systems. There is no progress in tank construction without diagnostics. In [1]: import torch I’m starting a GPU graphics card (which I [...]

Artykuł Pytorch regression 2.3 [AirQualityUCI.csv] pochodzi z serwisu THE DATA SCIENCE LIBRARY.

]]>
020520201302

Work on diagnostic systems.

There is no progress in tank construction without diagnostics.
obraz.png

In [1]:
import torch

I’m starting a GPU graphics card (which I don’t have)

Odpalam karte graficzną GPU (której nie mam)

In [2]:
device = torch.device('cpu') # obliczenia robie na CPU
#device = torch.device('cuda') # obliczenia robie na GPU

Output variables (3):

  • SLUMP (cm)
  • FLOW (cm)
  • 28-day Compressive Strength (Mpa)
In [3]:
import pandas as pd

df = pd.read_csv('/home/wojciech/Pulpit/2/AirQualityUCI.csv', sep=';')
print(df.shape)
df.head(3)
(9471, 17)
Out[3]:
Date Time CO(GT) PT08.S1(CO) NMHC(GT) C6H6(GT) PT08.S2(NMHC) NOx(GT) PT08.S3(NOx) NO2(GT) PT08.S4(NO2) PT08.S5(O3) T RH AH Unnamed: 15 Unnamed: 16
0 10/03/2004 18.00.00 2,6 1360.0 150.0 11,9 1046.0 166.0 1056.0 113.0 1692.0 1268.0 13,6 48,9 0,7578 NaN NaN
1 10/03/2004 19.00.00 2 1292.0 112.0 9,4 955.0 103.0 1174.0 92.0 1559.0 972.0 13,3 47,7 0,7255 NaN NaN
2 10/03/2004 20.00.00 2,2 1402.0 88.0 9,0 939.0 131.0 1140.0 114.0 1555.0 1074.0 11,9 54,0 0,7502 NaN NaN

I fill all holes with values out of range

Wypełniam wszystkie dziury wartościami z poza zakresu

In [4]:
a,b = df.shape     #<- ile mamy kolumn
b

print('NUMBER OF EMPTY RECORDS vs. FULL RECORDS')
print('----------------------------------------')
for i in range(1,b):
    i = df.columns[i]
    r = df[i].isnull().sum()
    h = df[i].count()
    pr = (r/h)*100
   
    if r > 0:
        print(i,"--------",r,"--------",h,"--------",pr) 
NUMBER OF EMPTY RECORDS vs. FULL RECORDS
----------------------------------------
Time -------- 114 -------- 9357 -------- 1.2183392112856686
CO(GT) -------- 114 -------- 9357 -------- 1.2183392112856686
PT08.S1(CO) -------- 114 -------- 9357 -------- 1.2183392112856686
NMHC(GT) -------- 114 -------- 9357 -------- 1.2183392112856686
C6H6(GT) -------- 114 -------- 9357 -------- 1.2183392112856686
PT08.S2(NMHC) -------- 114 -------- 9357 -------- 1.2183392112856686
NOx(GT) -------- 114 -------- 9357 -------- 1.2183392112856686
PT08.S3(NOx) -------- 114 -------- 9357 -------- 1.2183392112856686
NO2(GT) -------- 114 -------- 9357 -------- 1.2183392112856686
PT08.S4(NO2) -------- 114 -------- 9357 -------- 1.2183392112856686
PT08.S5(O3) -------- 114 -------- 9357 -------- 1.2183392112856686
T -------- 114 -------- 9357 -------- 1.2183392112856686
RH -------- 114 -------- 9357 -------- 1.2183392112856686
AH -------- 114 -------- 9357 -------- 1.2183392112856686
Unnamed: 15 -------- 9471 -------- 0 -------- inf
Unnamed: 16 -------- 9471 -------- 0 -------- inf
/home/wojciech/anaconda3/lib/python3.7/site-packages/ipykernel_launcher.py:10: RuntimeWarning: divide by zero encountered in long_scalars
  # Remove the CWD from sys.path while we load stuff.
In [5]:
import seaborn as sns

sns.heatmap(df.isnull(),yticklabels=False,cbar=False,cmap='viridis')
Out[5]:
<matplotlib.axes._subplots.AxesSubplot at 0x7f69049bb150>
In [6]:
del df['Unnamed: 15']
del df['Unnamed: 16']

df = df.dropna(how='any') # jednak je kasuje te dziury

# df.fillna(-777, inplace=True)
df.isnull().sum()
Out[6]:
Date             0
Time             0
CO(GT)           0
PT08.S1(CO)      0
NMHC(GT)         0
C6H6(GT)         0
PT08.S2(NMHC)    0
NOx(GT)          0
PT08.S3(NOx)     0
NO2(GT)          0
PT08.S4(NO2)     0
PT08.S5(O3)      0
T                0
RH               0
AH               0
dtype: int64
In [7]:
print(df.dtypes)
df.head(3)
Date              object
Time              object
CO(GT)            object
PT08.S1(CO)      float64
NMHC(GT)         float64
C6H6(GT)          object
PT08.S2(NMHC)    float64
NOx(GT)          float64
PT08.S3(NOx)     float64
NO2(GT)          float64
PT08.S4(NO2)     float64
PT08.S5(O3)      float64
T                 object
RH                object
AH                object
dtype: object
Out[7]:
Date Time CO(GT) PT08.S1(CO) NMHC(GT) C6H6(GT) PT08.S2(NMHC) NOx(GT) PT08.S3(NOx) NO2(GT) PT08.S4(NO2) PT08.S5(O3) T RH AH
0 10/03/2004 18.00.00 2,6 1360.0 150.0 11,9 1046.0 166.0 1056.0 113.0 1692.0 1268.0 13,6 48,9 0,7578
1 10/03/2004 19.00.00 2 1292.0 112.0 9,4 955.0 103.0 1174.0 92.0 1559.0 972.0 13,3 47,7 0,7255
2 10/03/2004 20.00.00 2,2 1402.0 88.0 9,0 939.0 131.0 1140.0 114.0 1555.0 1074.0 11,9 54,0 0,7502

to_datetime

In [8]:
df['Date'] =  pd.to_datetime(df['Date'])
df['weekday'] = df.Date.dt.weekday
df['month'] =df.Date.dt.month
df['weekofyear'] =df.Date.dt.weekofyear 
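One hedged remark on the cell above: the Date column uses the dd/mm/yyyy layout (for example 10/03/2004 is 10 March 2004), and without a hint pd.to_datetime parses such strings month-first whenever both readings are possible. Passing dayfirst=True, or an explicit format, makes the intent unambiguous; a one-line alternative to the call above, applied to the raw text column:

df['Date'] = pd.to_datetime(df['Date'], format='%d/%m/%Y')   # or: pd.to_datetime(df['Date'], dayfirst=True)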
In [9]:
del df['Date']
In [10]:
print(df.dtypes)
df.head(3)
Time              object
CO(GT)            object
PT08.S1(CO)      float64
NMHC(GT)         float64
C6H6(GT)          object
PT08.S2(NMHC)    float64
NOx(GT)          float64
PT08.S3(NOx)     float64
NO2(GT)          float64
PT08.S4(NO2)     float64
PT08.S5(O3)      float64
T                 object
RH                object
AH                object
weekday            int64
month              int64
weekofyear         int64
dtype: object
Out[10]:
Time CO(GT) PT08.S1(CO) NMHC(GT) C6H6(GT) PT08.S2(NMHC) NOx(GT) PT08.S3(NOx) NO2(GT) PT08.S4(NO2) PT08.S5(O3) T RH AH weekday month weekofyear
0 18.00.00 2,6 1360.0 150.0 11,9 1046.0 166.0 1056.0 113.0 1692.0 1268.0 13,6 48,9 0,7578 6 10 40
1 19.00.00 2 1292.0 112.0 9,4 955.0 103.0 1174.0 92.0 1559.0 972.0 13,3 47,7 0,7255 6 10 40
2 20.00.00 2,2 1402.0 88.0 9,0 939.0 131.0 1140.0 114.0 1555.0 1074.0 11,9 54,0 0,7502 6 10 40

Encodes text values

Koduje wartości tekstowe

In [11]:
import numpy as np

a,b = df.shape     #<- ile mamy kolumn
b

print('DISCRETE FUNCTIONS CODED')
print('------------------------')
for i in range(1,b):
    i = df.columns[i]
    f = df[i].dtypes
    if f == np.object:
        print(i,"---",f)   
    
        if f == np.object:
        
            df[i] = pd.Categorical(df[i]).codes
        
            continue
DISCRETE FUNCTIONS CODED
------------------------
CO(GT) --- object
C6H6(GT) --- object
T --- object
RH --- object
AH --- object
In [12]:
df['Time'] = pd.Categorical(df['Time']).codes
df['Time'] = df['Time'].astype(int)
In [13]:
df.dtypes
Out[13]:
Time               int64
CO(GT)              int8
PT08.S1(CO)      float64
NMHC(GT)         float64
C6H6(GT)           int16
PT08.S2(NMHC)    float64
NOx(GT)          float64
PT08.S3(NOx)     float64
NO2(GT)          float64
PT08.S4(NO2)     float64
PT08.S5(O3)      float64
T                  int16
RH                 int16
AH                 int16
weekday            int64
month              int64
weekofyear         int64
dtype: object
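A side note on the columns flagged above (CO(GT), C6H6(GT), T, RH and AH): they arrived as text only because the file uses a comma as the decimal separator, so turning them into categorical codes throws away their numeric meaning. A hedged sketch of keeping them numeric instead, applied before the categorical pass above (both options are alternatives, not part of this notebook):

# option 1: let pandas handle the separator at read time
# df = pd.read_csv('/home/wojciech/Pulpit/2/AirQualityUCI.csv', sep=';', decimal=',')

# option 2: convert the raw text columns to floats before any encoding
for col in ['CO(GT)', 'C6H6(GT)', 'T', 'RH', 'AH']:
    df[col] = df[col].astype(str).str.replace(',', '.').astype(float)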
In [14]:
df.columns
Out[14]:
Index(['Time', 'CO(GT)', 'PT08.S1(CO)', 'NMHC(GT)', 'C6H6(GT)',
       'PT08.S2(NMHC)', 'NOx(GT)', 'PT08.S3(NOx)', 'NO2(GT)', 'PT08.S4(NO2)',
       'PT08.S5(O3)', 'T', 'RH', 'AH', 'weekday', 'month', 'weekofyear'],
      dtype='object')

I specify what is X and what is y

Określam co jest X a co y

In [15]:
X = df.drop(['CO(GT)'],1)
y = df['CO(GT)']

Scaling (normalization) of the X value

X should never be too big. Ideally, it should be in the range [-1, 1]. If this is not the case, normalize the input.

Skalowanie (normalizacja) wartości X

X nigdy nie powinien być zbyt duży. Idealnie powinien być w zakresie [-1, 1]. Jeśli tak nie jest, należy znormalizować dane wejściowe.

In [16]:
from sklearn.preprocessing import StandardScaler

sc = StandardScaler()
X = sc.fit_transform(X)

print(np.round(X.std(), decimals=2), np.round(X.mean(), decimals=2))
1.0 0.0
In [17]:
y.value_counts()
Out[17]:
0      1592
16      279
18      275
17      273
13      262
       ... 
101       1
22        1
102       1
87        1
99        1
Name: CO(GT), Length: 104, dtype: int64
In [18]:
y = (y / 10)  # scale the target down (CO(GT) divided by 10)
#print(y.head(3))
print(np.round(y.std(), decimals=2), np.round(y.mean(), decimals=2))
1.86 2.13

Create the input and output tensors

Tworzymy tensory wejściowe i wyjściowe

In [19]:
import numpy as np

#X = X.values       #- jak była normalizacja to to nie działa
X = torch.tensor(X)
print(X[:3])
tensor([[ 0.9391,  0.9430,  2.2112, -1.0982,  0.4423, -0.0102,  0.8106,  0.4321,
          0.6433,  0.6411, -0.9377,  0.0588, -0.6647,  1.4930,  1.0552,  0.8894],
        [ 1.0836,  0.7368,  1.9394,  1.4092,  0.1765, -0.2549,  1.1771,  0.2667,
          0.3586, -0.0067, -0.9624, -0.0061, -0.7528,  1.4930,  1.0552,  0.8894],
        [ 1.2280,  1.0703,  1.7677,  1.3815,  0.1297, -0.1461,  1.0715,  0.4400,
          0.3500,  0.2165, -1.0777,  0.3347, -0.6871,  1.4930,  1.0552,  0.8894]],
       dtype=torch.float64)
In [20]:
X = X.type(torch.FloatTensor)
print(X[:3])
tensor([[ 0.9391,  0.9430,  2.2112, -1.0982,  0.4423, -0.0102,  0.8106,  0.4321,
          0.6433,  0.6411, -0.9377,  0.0588, -0.6647,  1.4930,  1.0552,  0.8894],
        [ 1.0836,  0.7368,  1.9394,  1.4092,  0.1765, -0.2549,  1.1771,  0.2667,
          0.3586, -0.0067, -0.9624, -0.0061, -0.7528,  1.4930,  1.0552,  0.8894],
        [ 1.2280,  1.0703,  1.7677,  1.3815,  0.1297, -0.1461,  1.0715,  0.4400,
          0.3500,  0.2165, -1.0777,  0.3347, -0.6871,  1.4930,  1.0552,  0.8894]])
In [21]:
y = y.values   # tworzymy macierz numpy - jak była normalizacja to to nie działa
In [22]:
y = torch.tensor(y)
print(y[:3])
tensor([3.3000, 2.6000, 2.9000], dtype=torch.float64)

Transposes the result vector so that it becomes a column

Transponuje wektor wynikowy, aby stał się kolumną

y = y.view(y.shape[0],1)
y[:5]

In [23]:
y = y.type(torch.FloatTensor)

from sklearn.preprocessing import StandardScaler

sc = StandardScaler()
y = sc.fit_transform(y)   # NOTE: y is still a 1-D tensor here, so this call raises an error
                          # (StandardScaler expects a 2-D array); the CO(GT)/10 values above are what is used later

print(np.round(y.std(), decimals=2), np.round(y.mean(), decimals=2))

In [24]:
print('X:',X.shape)
print('y:',y.shape)
X: torch.Size([9357, 16])
y: torch.Size([9357])

Adding one dimension to the result vector

Dodanie jednego wymiaru do wektora wynikowego

In [25]:
y = y.view(y.shape[0],1)
y.shape
Out[25]:
torch.Size([9357, 1])

Split into a test set and a training set

Podział na zbiór testowy i zbiór treningowy

In [26]:
a,b = X.shape
a

total_records = a
test_records = int(a * .2)

X_train = X[:total_records-test_records]
X_test = X[total_records-test_records:total_records]

y_train = y[:total_records-test_records]
y_test = y[total_records-test_records:total_records]
In [27]:
print('X_train: ',X_train.shape)
print('X_test:  ',X_test.shape)
print('----------------------------------------------------')
print('y_train: ',y_train.shape)
print('y_test:  ',y_test.shape)
X_train:  torch.Size([7486, 16])
X_test:   torch.Size([1871, 16])
----------------------------------------------------
y_train:  torch.Size([7486, 1])
y_test:   torch.Size([1871, 1])

Model

In [28]:
N, D_in = X_train.shape
N, D_out = y_train.shape

H = 100
device = torch.device('cpu')
In [29]:
model = torch.nn.Sequential(
          torch.nn.Linear(D_in, H),
          torch.nn.ReLU(),
          torch.nn.ReLU(),
          torch.nn.Linear(H, D_out),
        ).to(device)

MSE loss function

Funkcja straty MSE

In [30]:
loss_fn = torch.nn.MSELoss(reduction='sum')

Defining the learning process

Definiowanie nauki

In [31]:
y_pred = model(X_train)
y_pred[:5]
Out[31]:
tensor([[-0.2812],
        [-0.1326],
        [-0.1032],
        [-0.0952],
        [-0.1497]], grad_fn=<SliceBackward>)
In [32]:
learning_rate = 0.00001
epochs = 3000
aggregated_losses = []

for t in range(epochs):

   y_pred = model(X_train)

   loss = loss_fn(y_pred, y_train) # <= compute the loss: we pass Tensors containing the predicted and the true
                                   #    y values, and the loss function returns a Tensor containing the loss

   if t % 300 == 0:
      print(t, loss.item())

   aggregated_losses.append(loss)  ## kept for the loss plot below

   model.zero_grad()    #<= zero the gradients before running the backward pass


   loss.backward()      #<== backward pass: compute the gradient of the loss with respect to all learnable
                                 # parameters of the model; internally, each module's parameters are stored
                                 # in Tensors with requires_grad=True, so this call computes the gradients
                                 # of all learnable parameters in the model

   with torch.no_grad():              #<== update the weights with gradient descent; each parameter is a Tensor,
     for param in model.parameters():         # so we can access its data and gradients as before
       param.data -= learning_rate * param.grad
0 57620.95703125
300 3887.6298828125
600 3504.96484375
900 3247.74169921875
1200 3244.993896484375
1500 3150.48681640625
1800 3007.068603515625
2100 2773.484619140625
2400 3253.34619140625
2700 2737.288330078125

There are many potential reasons for a loss curve like this; the most likely is exploding gradients. The two things to try first:

  • Normalize the inputs
  • Lower the learning rate

Istnieje wiele potencjalnych przyczyn. Najprawdopodobniej wybuchające gradienty. Dwie rzeczy do wypróbowania w pierwszej kolejności:

  • Normalizuj wejścia
  • Obniż tempo uczenia się
In [33]:
import matplotlib.pyplot as plt

plt.plot(range(epochs), aggregated_losses)
plt.ylabel('Loss')
plt.xlabel('epoch')
plt.show
Out[33]:
<function matplotlib.pyplot.show(*args, **kw)>

Forecast based on the model

  • we plug in the same equations that were used in the model
  • the loss value below corresponds to the model's last learning pass
  • the loss shows how wrong the model is (loss = sum of squared errors) after the last learning pass

Prognoza na podstawie modelu

  • podstawiamy te same równania, które były w modelu
  • Poniższy wynik loss pokazuje ostatnią sekwencję modelu
  • Loss pokazuje, ile myli się model (loss = suma kwadratów błędów) po ostatniej sekwencji uczenia się
    obraz.png
In [34]:
with torch.no_grad():
    y_pred = model(X_test)  
    loss = (y_pred - y_test).pow(2).sum()

    print(f'Loss test_set: {loss:.8f}')
Loss test_set: 1491.50219727

Since we decided that the output layer has a single neuron, each prediction contains one value. For example, the first 5 predicted values look like this:

Ponieważ ustaliliśmy, że nasza warstwa wyjściowa będzie zawierać 1 neuron, każda prognoza będzie zawierać 1 wartość. Przykładowo pierwsze 5 przewidywanych wartości wygląda następująco:

In [35]:
y_pred[:5]
Out[35]:
tensor([[1.6469],
        [1.7758],
        [1.4006],
        [2.5196],
        [2.7246]])

We save the whole model

Zapisujemy cały model

In [36]:
torch.save(model,'/home/wojciech/Pulpit/7/byk15.pb')

We restore the whole model

Odtwarzamy cały model

In [37]:
KOT = torch.load('/home/wojciech/Pulpit/7/byk15.pb')
KOT.eval()
Out[37]:
Sequential(
  (0): Linear(in_features=16, out_features=100, bias=True)
  (1): ReLU()
  (2): ReLU()
  (3): Linear(in_features=100, out_features=1, bias=True)
)

By substituting other independent variables, you can get a vector of output variables

We choose a random record from the tensor

Podstawiając inne zmienne niezależne można uzyskać wektor zmiennych wyjściowych

Wybieramy sobie jakąś losowy rekord z tensora

obraz.png

In [38]:
y_pred = y_pred*10
foka = y_pred.cpu().detach().numpy()
df11 = pd.DataFrame(foka)
df11.columns = ['y_pred']
df11=np.round(df11.y_pred)
df11.head(3)
Out[38]:
0    16.0
1    18.0
2    14.0
Name: y_pred, dtype: float32
In [39]:
y_test = y_test*10
foka = y_test.cpu().detach().numpy()
df_t = pd.DataFrame(foka)
df_t.columns = ['y']
df_t.head(3)
Out[39]:
y
0 13.0
1 13.0
2 14.0
In [40]:
NOWA = pd.merge(df_t,df11, how='inner', left_index=True, right_index=True)
NOWA.head(3)
Out[40]:
y y_pred
0 13.0 16.0
1 13.0 18.0
2 14.0 14.0
In [41]:
NOWA.to_csv('/home/wojciech/Pulpit/7/NOWA.csv')
In [42]:
fig, ax = plt.subplots( figsize=(16, 2))
for ewa in ['y', 'y_pred']:
    ax.plot(NOWA, label=ewa)
    
ax.set_xlim(1340, 1500)
#ax.legend()
ax.set_ylabel('Parameter')
ax.set_title('COURSE OF THE PROJECTING PROCESS ON THE TEST SET')
Out[42]:
Text(0.5, 1.0, 'COURSE OF THE PROJECTING PROCESS ON THE TEST SET')
In [43]:
## marginesy
plt.subplots_adjust( left = None , bottom = None , right = None , top = None , wspace = None , hspace = None )
plt.figure(figsize=(16,5))
ax = plt.subplot(1, 2, 1)
NOWA.plot.kde(ax=ax, legend=True, title='Histogram: y vs. y_pred')
NOWA.plot.hist(density=True,bins=40, ax=ax, alpha=0.3)
ax.set_title("Dystributions")

ax = plt.subplot(1, 2, 2)
sns.boxplot(data = NOWA)
plt.xticks(rotation=-90)
ax.set_title("Boxes")


sns.lmplot(data=NOWA, x='y', y='y_pred')
Out[43]:
<seaborn.axisgrid.FacetGrid at 0x7f68fac9b850>
<Figure size 432x288 with 0 Axes>

Regression_Assessment

In [44]:
## Makes the assessment for a single variable only (Robi ocenę tylko dla jednej zmiennej)

def Regression_Assessment(y, y_pred):
    
    from sklearn.metrics import r2_score 
    import scipy.stats as stats
    from statsmodels.graphics.gofplots import qqplot
    from matplotlib import pyplot
       
    print('-----two methods--------------')
    SS_Residual = sum((y-y_pred)**2)
    SS_Total = sum((y-np.mean(y))**2)
    r_squared = 1 - (float(SS_Residual))/SS_Total
    adjusted_r_squared = 1 - (1-r_squared)*(len(y)-1)/(len(y)-X.shape[1]-1)
    print('r2_score:           %0.3f' % r_squared)
    #print('adjusted_r_squared: %0.3f' % adjusted_r_squared)
    #print('----r2_score------second-method--------')  
    print('r2_score:           %0.3f' % r2_score(y, y_pred))
    print()
    print('-------------------------------')
    MAE = (abs(y-y_pred)).mean()
    print('Mean absolute error     MAE:  %0.2f ' % MAE)
    RMSE = np.sqrt(((y-y_pred)**2).mean())
    print('Root mean squared error RMSE: %0.2f ' % RMSE)
    pt = (100*(y-y_pred))/y
    MAPE = (abs(pt)).mean()
    print('Mean absolute error     MAPE: %0.2f ' % MAPE)
    print('-------------------------------')
    
    stat,pvalue0 = stats.ttest_1samp(a=(y-y_pred),popmean=0.0)

    if pvalue0 > 0.01:
        print('t-test H0:suma reszt modelu wynosi zero--')
        print('OK! Resztki modelu nie różnią się od zera - pvalue: %0.4f > 0.01 (Nie odrzucamy H0)' % pvalue0)
    else:     
        print('Źle - Resztki modelu RÓŻNIĄ SIĘ OD ZERA - pvalue: %0.4f < 0.01 (Odrzucamy H0)' % pvalue0)
    print('--------------------------------------------------------------------------------------------') 
  
       
    stat,pvalue2_1 = stats.shapiro(y)
    stat,pvalue2_2 = stats.shapiro(y_pred)

    if pvalue2_1 > 0.01:
        #print('Shapiro-Wilk H0: y maj rozkład normalny?--------------------------------')
        print('OK Shapiro-Wilk! y maja rozkład normalny - pvalue: %0.4f > 0.01 (Nie odrzucamy H0)' % pvalue2_1)
    else:     
        print('Źle Shapiro-Wilk - y NIE MA ROZKŁADU NORMALNEGO - pvalue: %0.4f < 0.01 (Odrzucamy H0)' % pvalue2_1)
        print('--------------------------------------------------------------------------------------------')
    if pvalue2_2 > 0.01:
        #print('Shapiro-Wilk: y_pred maj rozkład normalny?--')
        print('OK Shapiro-Wilk! y_pred ma rozkład normalny - pvalue: %0.4f > 0.01 (Nie odrzucamy H0)' % pvalue2_2)
    else:     
        print('Źle Shapiro-Wilk y_pred NIE MA ROZKŁADU NORMALNEGO - pvalue: %0.4f < 0.01 (Odrzucamy H0)' % pvalue2_2)
    
    qqplot(y, line='s')
    pyplot.show()

    qqplot(y_pred, line='s')
    pyplot.show()
       
    print('--------------------------------------------------------------------------------------------')
        
    stat,pvalue3 = stats.kruskal(y_pred,y)
    stat,pvalue4 = stats.f_oneway(y_pred,y)

    if pvalue2_1 < 0.01 or pvalue2_2 < 0.01:
        print('Shapiro-Wilk: Zmienne nie mają rozkładu normalnego! Nie można zrobić analizy ANOVA')
     
        if pvalue3 > 0.01:
            print('Kruskal-Wallis NON-PARAMETRIC TEST: czy prognoza i obserwacje empir. mają równe średnie?')
            print('OK! Kruskal-Wallis H0: prognoza i obserwacje empir. mają równe średnie - pvalue: %0.4f > 0.01 (Nie odrzucamy H0)' % pvalue3)
        else:     
            print('Źle - Kruskal-Wallis: prognoza i obserwacje empir. NIE MAJĄ równych średnich - pvalue: %0.4f < 0.01 (Odrzucamy H0)' % pvalue3)
    
    else:

        if pvalue4 > 0.01:
            print('F-test (ANOVA): czy prognoza i obserwacje empir. mają równe średnie?--------------------------------')
            print('OK! prognoza i obserwacje empir. mają równe średnie - pvalue: %0.4f > 0.01 (Nie odrzucamy H0)' % pvalue4)
        else:     
            print('Źle - prognoza i obserwacje empir. NIE MAJĄ równych średnich - pvalue: %0.4f < 0.01 (Odrzucamy H0)' % pvalue4)
    print('--------------------------------------------------------------------------------------------')
In [45]:
y = NOWA['y']
y_pred = NOWA['y_pred']

Regression_Assessment(y, y_pred)
-----two methods--------------
r2_score:           0.723
r2_score:           0.723

-------------------------------
Mean absolute error     MAE:  6.38 
Root mean squared error RMSE: 8.93 
Mean absolute error     MAPE: inf 
-------------------------------
Źle - Resztki modelu RÓŻNIĄ SIĘ OD ZERA - pvalue: 0.0000 < 0.01 (Odrzucamy H0)
--------------------------------------------------------------------------------------------
Źle Shapiro-Wilk - y NIE MA ROZKŁADU NORMALNEGO - pvalue: 0.0000 < 0.01 (Odrzucamy H0)
--------------------------------------------------------------------------------------------
Źle Shapiro-Wilk y_pred NIE MA ROZKŁADU NORMALNEGO - pvalue: 0.0000 < 0.01 (Odrzucamy H0)
--------------------------------------------------------------------------------------------
Shapiro-Wilk: Zmienne nie mają rozkładu normalnego! Nie można zrobić analizy ANOVA
Źle - Kruskal-Wallis: prognoza i obserwacje empir. NIE MAJĄ równych średnich - pvalue: 0.0000 < 0.01 (Odrzucamy H0)
--------------------------------------------------------------------------------------------
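The inf MAPE above is not an arithmetic glitch: the target CO(GT) contains many zero values (1 592 of them, according to the value_counts earlier), and dividing by a zero target blows the percentage error up to infinity. A common workaround, sketched here as a suggestion rather than a fix, is to drop the zero targets from the MAPE or to use a symmetric variant:

import numpy as np

mask = y != 0                                                 # ignore rows whose target is zero
mape_nonzero = (abs((y[mask] - y_pred[mask]) / y[mask])).mean() * 100

# symmetric MAPE, defined even when an individual target is zero
smape = (200 * abs(y - y_pred) / (abs(y) + abs(y_pred))).mean()

print('MAPE (non-zero targets): {:.2f}'.format(mape_nonzero))
print('sMAPE:                   {:.2f}'.format(smape))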

Mean absolute error MAE and RMSE

obraz.png

Percentage errors MAPE

obraz.png

obraz.png

Artykuł Pytorch regression 2.3 [AirQualityUCI.csv] pochodzi z serwisu THE DATA SCIENCE LIBRARY.

]]>
Pytorch regression _2.1_ [WorldHappinessReport.csv] https://sigmaquality.pl/models/pytorch/pytorch-regression-_2-1_-worldhappinessreport-csv-300420201044/ Thu, 30 Apr 2020 08:47:51 +0000 http://sigmaquality.pl/pytorch-regression-_2-1_-worldhappinessreport-csv-300420201044/ 300420201044 https://github.com/jcjohnson/pytorch-examples#pytorch-custom-nn-modules In [1]: import torch I’m starting a GPU graphics card (which I don’t have) Odpalam karte graficzną GPU (której nie mam) In [2]: device = [...]

Artykuł Pytorch regression _2.1_ [WorldHappinessReport.csv] pochodzi z serwisu THE DATA SCIENCE LIBRARY.

]]>
300420201044

https://github.com/jcjohnson/pytorch-examples#pytorch-custom-nn-modules

In [1]:
import torch

I’m starting a GPU graphics card (which I don’t have)

Odpalam karte graficzną GPU (której nie mam)

In [2]:
device = torch.device('cpu') # obliczenia robie na CPU
#device = torch.device('cuda') # obliczenia robie na GPU
In [3]:
import pandas as pd

df = pd.read_csv('/home/wojciech/Pulpit/1/WorldHappinessReport.csv')
df.head(3)
Out[3]:
Unnamed: 0 Country Region Happiness Rank Happiness Score Economy (GDP per Capita) Family Health (Life Expectancy) Freedom Trust (Government Corruption) Generosity Dystopia Residual Year
0 0 Afghanistan Southern Asia 153.0 3.575 0.31982 0.30285 0.30335 0.23414 0.09719 0.36510 1.95210 2015.0
1 1 Albania Central and Eastern Europe 95.0 4.959 0.87867 0.80434 0.81325 0.35733 0.06413 0.14272 1.89894 2015.0
2 2 Algeria Middle East and Northern Africa 68.0 5.605 0.93929 1.07772 0.61766 0.28579 0.17383 0.07822 2.43209 2015.0

I fill all holes with values out of range

Wypełniam wszystkie dziury wartościami z poza zakresu

In [4]:
del df['Unnamed: 0'] 
In [5]:
df = df.dropna(how='any')

# df.fillna(-777, inplace=True)
df.isnull().sum()
Out[5]:
Country                          0
Region                           0
Happiness Rank                   0
Happiness Score                  0
Economy (GDP per Capita)         0
Family                           0
Health (Life Expectancy)         0
Freedom                          0
Trust (Government Corruption)    0
Generosity                       0
Dystopia Residual                0
Year                             0
dtype: int64
In [6]:
print(df.dtypes)
df.head(3)
Country                           object
Region                            object
Happiness Rank                   float64
Happiness Score                  float64
Economy (GDP per Capita)         float64
Family                           float64
Health (Life Expectancy)         float64
Freedom                          float64
Trust (Government Corruption)    float64
Generosity                       float64
Dystopia Residual                float64
Year                             float64
dtype: object
Out[6]:
Country Region Happiness Rank Happiness Score Economy (GDP per Capita) Family Health (Life Expectancy) Freedom Trust (Government Corruption) Generosity Dystopia Residual Year
0 Afghanistan Southern Asia 153.0 3.575 0.31982 0.30285 0.30335 0.23414 0.09719 0.36510 1.95210 2015.0
1 Albania Central and Eastern Europe 95.0 4.959 0.87867 0.80434 0.81325 0.35733 0.06413 0.14272 1.89894 2015.0
2 Algeria Middle East and Northern Africa 68.0 5.605 0.93929 1.07772 0.61766 0.28579 0.17383 0.07822 2.43209 2015.0

Encodes text values

Koduje wartości tekstowe

In [7]:
import numpy as np

a,b = df.shape     #<- ile mamy kolumn
b

print('DISCRETE FUNCTIONS CODED')
print('------------------------')
for i in range(1,b):
    i = df.columns[i]
    f = df[i].dtypes
    if f == np.object:
        print(i,"---",f)   
    
        if f == np.object:
        
            df[i] = pd.Categorical(df[i]).codes
        
            continue
DISCRETE FUNCTIONS CODED
------------------------
Region --- object
In [8]:
df['Country'] = pd.Categorical(df['Country']).codes
df['Country'] = df['Country'].astype(int)
In [9]:
df.dtypes
Out[9]:
Country                            int64
Region                              int8
Happiness Rank                   float64
Happiness Score                  float64
Economy (GDP per Capita)         float64
Family                           float64
Health (Life Expectancy)         float64
Freedom                          float64
Trust (Government Corruption)    float64
Generosity                       float64
Dystopia Residual                float64
Year                             float64
dtype: object

I specify what is X and what is y

Określam co jest X a co y

In [10]:
X = df.drop('Happiness Score',axis=1)
y =df['Happiness Score']

Scaling (normalization) of the X value

X should never be too big. Ideally, it should be in the range [-1, 1]. If this is not the case, normalize the input.

Skalowanie (normalizacja) wartości X

X nigdy nie powinien być zbyt duży. Idealnie powinien być w zakresie [-1, 1]. Jeśli tak nie jest, należy znormalizować dane wejściowe.

In [11]:
from sklearn.preprocessing import StandardScaler

sc = StandardScaler()
X = sc.fit_transform(X)

print(np.round(X.std(), decimals=2), np.round(X.mean(), decimals=2))
1.0 0.0
In [12]:
y = y / 100  # scale the target down (Happiness Score divided by 100)
#print(y.head(3))
print(np.round(y.std(), decimals=2), np.round(y.mean(), decimals=2))
0.01 0.05

Create the input and output tensors

Tworzymy tensory wejściowe i wyjściowe

In [13]:
import numpy as np

#X = X.values       #- jak była normalizacja to to nie działa
X = torch.tensor(X)
print(X[:3])
tensor([[-1.7017,  0.6348,  1.6377, -1.4634, -2.1570, -1.1514, -1.1211, -0.3355,
          0.9371, -0.2564, -1.2157],
        [-1.6806, -1.3723,  0.3567, -0.1154, -0.5823,  0.9770, -0.3015, -0.6331,
         -0.7552, -0.3511, -1.2157],
        [-1.6595, -0.3688, -0.2395,  0.0309,  0.2762,  0.1606, -0.7774,  0.3545,
         -1.2461,  0.5988, -1.2157]], dtype=torch.float64)
In [14]:
X = X.type(torch.FloatTensor)
print(X[:3])
tensor([[-1.7017,  0.6348,  1.6377, -1.4634, -2.1570, -1.1514, -1.1211, -0.3355,
          0.9371, -0.2564, -1.2157],
        [-1.6806, -1.3723,  0.3567, -0.1154, -0.5823,  0.9770, -0.3015, -0.6331,
         -0.7552, -0.3511, -1.2157],
        [-1.6595, -0.3688, -0.2395,  0.0309,  0.2762,  0.1606, -0.7774,  0.3545,
         -1.2461,  0.5988, -1.2157]])
In [ ]:
 
In [15]:
y = y.values   # tworzymy macierz numpy - jak była normalizacja to to nie działa
In [16]:
y = torch.tensor(y)
print(y[:3])
tensor([0.0358, 0.0496, 0.0561], dtype=torch.float64)

Transposes the result vector so that it becomes a column

Transponuje wektor wynikowy, aby stał się kolumną

In [17]:
y = y.view(y.shape[0],1)
y[:5]
Out[17]:
tensor([[0.0358],
        [0.0496],
        [0.0561],
        [0.0403],
        [0.0657]], dtype=torch.float64)
In [18]:
y = y.type(torch.FloatTensor)

from sklearn.preprocessing import StandardScaler

sc = StandardScaler()
y = sc.fit_transform(y)

print(np.round(y.std(), decimals=2), np.round(y.mean(), decimals=2))

In [19]:
print('X:',X.shape)
print('y:',y.shape)
X: torch.Size([469, 11])
y: torch.Size([469, 1])

Model

In [20]:
N, D_in = X.shape
N, D_out = y.shape

H = 30
device = torch.device('cpu')
In [21]:
model = torch.nn.Sequential(
          torch.nn.Linear(D_in, H),
          torch.nn.ReLU(),
          torch.nn.Linear(H, D_out),
        ).to(device)

MSE loss function

Funkcja straty MSE

In [22]:
loss_fn = torch.nn.MSELoss(reduction='sum')

Defining the learning process

Definiowanie nauki

In [23]:
y_pred = model(X)
y_pred
Out[23]:
tensor([[-4.2745e-01],
        [-3.6429e-03],
        [ 3.1600e-02],
        [-2.2681e-01],
        [-1.1513e-02],
        [-1.9216e-02],
        [-1.2311e-01],
        [-1.6105e-02],
        [ 5.4885e-02],
        [ 9.3276e-02],
        [-2.9303e-01],
        [ 1.1384e-01],
        [-4.6793e-03],
        [-3.5003e-01],
        [ 3.0583e-02],
        [-2.9466e-02],
        [-3.8709e-02],
        [-2.5406e-01],
        [-3.9137e-02],
        [ 1.5388e-01],
        [-1.7763e-01],
        [-4.0004e-01],
        [-7.0890e-02],
        [-3.0126e-01],
        [ 3.5626e-02],
        [-5.6126e-01],
        [-3.2454e-01],
        [ 6.9069e-03],
        [ 3.5522e-02],
        [-1.2959e-02],
        [-1.5253e-01],
        [-2.5984e-01],
        [-4.0041e-01],
        [-5.3569e-02],
        [-1.1350e-01],
        [ 9.3323e-03],
        [-1.2873e-02],
        [ 7.2967e-02],
        [-1.7973e-01],
        [ 4.2126e-02],
        [ 6.5027e-02],
        [-4.2669e-02],
        [ 1.8619e-02],
        [ 5.1381e-02],
        [-2.5496e-01],
        [ 6.6168e-02],
        [ 5.5571e-02],
        [-1.2586e-01],
        [-7.1015e-02],
        [ 2.4031e-02],
        [-2.3470e-01],
        [-1.3440e-01],
        [-2.7763e-02],
        [-3.9259e-01],
        [-2.1299e-01],
        [ 5.2305e-02],
        [-2.6931e-02],
        [ 1.1370e-01],
        [ 2.0505e-02],
        [-2.2530e-01],
        [-3.1724e-02],
        [-7.1249e-02],
        [-1.0609e-01],
        [ 7.5583e-02],
        [ 4.1280e-02],
        [ 1.3634e-01],
        [-3.2085e-01],
        [ 2.0309e-03],
        [-6.4143e-02],
        [ 8.2991e-02],
        [ 1.4143e-02],
        [-1.6307e-01],
        [-1.0309e-01],
        [-4.9232e-02],
        [-1.1677e-01],
        [ 1.1351e-02],
        [ 6.3761e-02],
        [ 5.0787e-02],
        [-3.0608e-01],
        [-3.4375e-01],
        [ 8.4144e-02],
        [-1.4717e-02],
        [ 2.7701e-03],
        [ 1.3630e-02],
        [-2.3664e-01],
        [-4.8505e-01],
        [ 5.1280e-02],
        [-2.8454e-01],
        [ 3.6705e-02],
        [-1.4809e-01],
        [ 2.8126e-02],
        [ 7.9958e-02],
        [-1.6470e-01],
        [-6.7994e-02],
        [-3.5120e-02],
        [-1.1673e-01],
        [-3.4031e-01],
        [-1.6971e-01],
        [-1.3144e-01],
        [ 1.3408e-01],
        [-8.7675e-02],
        [-4.2872e-03],
        [-3.4742e-01],
        [-3.2806e-01],
        [ 1.1534e-01],
        [ 8.0059e-03],
        [-2.1437e-02],
        [-2.8561e-01],
        [ 1.9570e-02],
        [ 4.2293e-02],
        [-9.1104e-02],
        [-3.6440e-02],
        [-1.9679e-01],
        [-9.4737e-02],
        [-3.7697e-02],
        [-7.7243e-02],
        [-6.1722e-02],
        [-9.0355e-02],
        [-1.8373e-02],
        [ 1.5134e-01],
        [-1.8642e-01],
        [-3.2258e-02],
        [-4.8399e-01],
        [ 1.3403e-01],
        [-2.6324e-03],
        [-1.4263e-01],
        [ 1.1676e-01],
        [-1.5671e-01],
        [-1.2794e-02],
        [ 8.9764e-02],
        [-1.1662e-01],
        [-1.2209e-01],
        [ 3.5171e-02],
        [-3.4670e-01],
        [ 9.6369e-02],
        [ 3.5116e-02],
        [ 1.1252e-01],
        [ 1.8931e-02],
        [-2.9094e-01],
        [-2.5447e-01],
        [-9.2450e-02],
        [-3.7711e-01],
        [-8.0330e-02],
        [-1.0223e-01],
        [ 3.7966e-02],
        [-6.4997e-02],
        [-3.4390e-01],
        [-1.6943e-01],
        [-4.6404e-02],
        [ 1.6054e-01],
        [ 4.9439e-02],
        [-1.3906e-02],
        [-2.0814e-01],
        [ 3.2258e-02],
        [-1.5231e-01],
        [-2.8329e-01],
        [-2.1860e-01],
        [-2.4882e-01],
        [-5.0408e-01],
        [-9.2120e-02],
        [-2.7293e-01],
        [-4.1514e-01],
        [-1.0440e-01],
        [-1.3683e-01],
        [-1.2180e-01],
        [-2.2596e-01],
        [-1.2288e-02],
        [-9.5664e-02],
        [-3.5454e-01],
        [-3.7126e-04],
        [-2.1165e-01],
        [-7.6128e-02],
        [-4.0232e-01],
        [-3.9331e-02],
        [-3.5365e-02],
        [-1.9719e-01],
        [-1.2559e-01],
        [-1.2059e-01],
        [ 1.1621e-01],
        [-2.5765e-01],
        [-5.2664e-01],
        [-1.2323e-01],
        [-3.3721e-01],
        [-1.0469e-01],
        [-4.3728e-01],
        [-1.0070e-01],
        [ 1.8569e-02],
        [-3.4372e-02],
        [-3.0936e-01],
        [-2.6187e-01],
        [-4.2382e-01],
        [-4.3243e-02],
        [-1.5634e-01],
        [-1.7728e-01],
        [-2.3042e-02],
        [-1.4630e-01],
        [ 9.1151e-02],
        [-2.4256e-02],
        [-1.9366e-01],
        [-7.1570e-02],
        [ 9.4558e-02],
        [-3.4334e-01],
        [-1.1295e-01],
        [-1.2794e-01],
        [-2.0222e-01],
        [-1.5829e-01],
        [-1.2256e-01],
        [-2.5393e-01],
        [-3.0474e-01],
        [-1.8759e-02],
        [-4.3256e-01],
        [-3.7892e-01],
        [-6.8839e-02],
        [-8.8371e-03],
        [ 4.7923e-03],
        [-8.7923e-02],
        [-2.4323e-01],
        [-1.3643e-01],
        [-1.3913e-01],
        [-1.2775e-01],
        [-9.1886e-02],
        [-3.2819e-02],
        [-6.7460e-02],
        [-3.2325e-01],
        [ 2.9318e-02],
        [-3.5069e-02],
        [-1.7418e-02],
        [ 6.5475e-02],
        [-2.6579e-01],
        [-1.8996e-01],
        [-4.6490e-02],
        [-1.3474e-01],
        [-1.3473e-01],
        [ 1.5172e-02],
        [-9.3274e-02],
        [-4.2761e-01],
        [ 5.3500e-02],
        [-2.9554e-02],
        [-1.3342e-01],
        [ 9.8934e-03],
        [-3.9405e-01],
        [-4.8164e-01],
        [-1.5648e-02],
        [-2.8614e-01],
        [-4.0472e-02],
        [-1.9807e-01],
        [-9.5416e-03],
        [ 2.7897e-02],
        [-1.5682e-01],
        [ 1.7725e-02],
        [-1.0876e-01],
        [-2.4430e-01],
        [-1.7526e-01],
        [-1.6604e-01],
        [-2.1545e-01],
        [ 4.2694e-03],
        [-1.1699e-01],
        [-9.5717e-03],
        [-3.3002e-01],
        [-4.2078e-01],
        [ 3.9176e-03],
        [-4.7472e-02],
        [-3.7422e-01],
        [-1.2959e-01],
        [ 5.5540e-02],
        [-2.9904e-02],
        [-1.7143e-02],
        [-3.9630e-02],
        [-1.4988e-03],
        [-3.3803e-02],
        [ 4.2111e-02],
        [-1.4506e-01],
        [-2.0571e-02],
        [-1.7663e-02],
        [-1.1568e-01],
        [ 7.8986e-02],
        [-2.1377e-01],
        [-5.9858e-02],
        [-4.7240e-01],
        [-1.3849e-02],
        [-9.6779e-03],
        [-2.2012e-02],
        [-2.2544e-01],
        [-1.2726e-01],
        [-1.6832e-01],
        [-2.2275e-02],
        [-4.5899e-01],
        [ 7.5663e-02],
        [-8.2426e-02],
        [-3.2082e-01],
        [ 6.7503e-02],
        [ 1.9751e-02],
        [-1.8723e-02],
        [-9.1598e-02],
        [ 2.2715e-02],
        [-2.2554e-01],
        [-2.8882e-01],
        [-1.3291e-01],
        [-3.9947e-01],
        [-2.4364e-02],
        [-1.6493e-01],
        [-2.1627e-02],
        [ 1.0300e-02],
        [-3.8896e-01],
        [-6.3563e-02],
        [-3.2330e-02],
        [ 6.4565e-02],
        [ 2.2611e-02],
        [ 2.0242e-02],
        [-1.9253e-01],
        [-3.0338e-02],
        [-1.4017e-01],
        [-2.6179e-01],
        [-2.6843e-01],
        [-3.8787e-01],
        [-3.5069e-01],
        [ 1.5691e-01],
        [-7.9159e-02],
        [-1.3093e-01],
        [-8.4490e-02],
        [ 1.2835e-01],
        [-2.8785e-01],
        [-2.7760e-01],
        [ 1.2939e-01],
        [-1.2159e-01],
        [-4.6848e-03],
        [ 1.1916e-01],
        [-2.7076e-01],
        [ 7.8918e-03],
        [-2.1981e-01],
        [-6.0121e-02],
        [-1.7439e-02],
        [ 7.2381e-02],
        [-1.0266e-01],
        [-6.7563e-02],
        [ 2.5719e-01],
        [-7.8539e-02],
        [-2.9228e-01],
        [-5.9970e-02],
        [-1.7658e-01],
        [-2.2883e-01],
        [-5.3707e-01],
        [-2.8134e-01],
        [-7.9801e-02],
        [ 5.3754e-02],
        [-3.9891e-02],
        [-5.8873e-02],
        [-1.5668e-01],
        [-1.1001e-01],
        [ 7.7573e-02],
        [-1.3546e-01],
        [-1.5186e-01],
        [-2.6282e-01],
        [-3.5541e-02],
        [ 3.5792e-02],
        [ 1.1476e-01],
        [ 4.4816e-02],
        [-5.3945e-02],
        [-6.5655e-02],
        [-2.3092e-01],
        [-1.7522e-01],
        [-5.9290e-02],
        [ 6.0194e-02],
        [-2.2226e-01],
        [-8.5102e-02],
        [-1.0436e-01],
        [-5.1286e-02],
        [-1.8722e-01],
        [-2.4780e-01],
        [ 1.2161e-01],
        [ 1.5192e-01],
        [-2.8800e-01],
        [ 4.1241e-02],
        [-1.8488e-01],
        [ 3.2783e-02],
        [ 1.5595e-01],
        [-2.3525e-01],
        [-1.3962e-01],
        [-8.9306e-02],
        [-1.8592e-01],
        [-1.2561e-03],
        [-1.9509e-01],
        [ 9.9536e-02],
        [-3.7024e-02],
        [-1.2256e-01],
        [ 5.7584e-02],
        [-1.6568e-01],
        [-6.1208e-02],
        [ 1.3184e-02],
        [ 1.0873e-01],
        [-2.0101e-01],
        [-2.3157e-01],
        [-5.2749e-02],
        [ 5.3078e-02],
        [-2.7642e-01],
        [ 1.1204e-01],
        [-1.2318e-01],
        [-2.4445e-01],
        [-1.3769e-01],
        [-7.3394e-02],
        [-2.5623e-01],
        [-3.1583e-03],
        [-9.2862e-02],
        [-2.1318e-02],
        [-5.9814e-02],
        [-6.7896e-02],
        [ 1.3920e-01],
        [ 2.1923e-02],
        [-2.5978e-01],
        [-1.5678e-01],
        [-1.4222e-01],
        [-1.7531e-02],
        [-1.8887e-01],
        [-3.4119e-01],
        [-6.7630e-03],
        [-1.6711e-01],
        [-1.9834e-01],
        [-7.1525e-02],
        [-2.5377e-01],
        [-2.7868e-01],
        [ 4.1292e-02],
        [-2.2371e-01],
        [-1.3157e-01],
        [-4.2236e-02],
        [-1.5551e-01],
        [-2.3040e-01],
        [-1.7524e-01],
        [-3.6064e-01],
        [-1.9167e-02],
        [-1.1715e-01],
        [-1.2774e-01],
        [-1.4843e-01],
        [-2.8300e-02],
        [ 3.3448e-02],
        [-3.5775e-01],
        [-2.6867e-01],
        [-7.0971e-02],
        [-2.8985e-01],
        [-2.7948e-01],
        [-2.4863e-01],
        [-1.3089e-02],
        [-3.3796e-01],
        [-1.9001e-01],
        [-2.2440e-01],
        [-1.2782e-01],
        [-1.8728e-01],
        [-2.1334e-01],
        [-8.4375e-02],
        [-1.5037e-01],
        [-2.0170e-01],
        [-1.6970e-01],
        [-3.8650e-01],
        [-3.0448e-01],
        [-2.8647e-01],
        [-2.9811e-02],
        [-1.0393e-01],
        [-1.6981e-01],
        [-2.5891e-01],
        [-6.0411e-02],
        [-2.9120e-01],
        [-2.3515e-01],
        [-3.4595e-01],
        [-3.2688e-01],
        [-4.6198e-01],
        [-7.3562e-02],
        [-2.4096e-01],
        [-1.1416e-01],
        [-2.0195e-01],
        [-2.3524e-01]], grad_fn=<AddmmBackward>)
In [24]:
learning_rate = 1e-4
epochs = 2500
aggregated_losses = []

for t in range(epochs):

   y_pred = model(X)

   loss = loss_fn(y_pred, y) # <= compute and print the loss: we pass Tensors containing the predicted and the true
   print(t, loss.item())     # <=   y values, and the loss function returns a Tensor containing the loss
   aggregated_losses.append(loss) ## kept for the loss plot

   model.zero_grad()    #<= zero the gradients before running the backward pass


   loss.backward()      #<== backward pass: compute the gradient of the loss with respect to all learnable
                                 # parameters of the model; internally, each module's parameters are stored
                                 # in Tensors with requires_grad=True, so this call computes the gradients
                                 # of all learnable parameters in the model

   with torch.no_grad():              #<== update the weights with gradient descent; each parameter is a Tensor,
     for param in model.parameters():         # so we can access its data and gradients as before
       param.data -= learning_rate * param.grad
0 22.952877044677734
1 13.857816696166992
2 9.691527366638184
3 7.705139636993408
4 6.698968887329102
5 6.140297889709473
...
2497 0.04682106152176857
2498 0.04679032415151596
2499 0.046759601682424545
In [51]:
import matplotlib.pyplot as plt

plt.plot(range(epochs), aggregated_losses)
plt.ylabel('Loss')
plt.xlabel('epoch')

plt.show()

Forecast based on the model

  • We feed the trained model the same inputs that were used during training.
  • The loss value below corresponds to the state of the model after the last training pass.
  • Loss measures how much the model is wrong (loss = sum of squared errors) after the last training pass.
In [26]:
with torch.no_grad():
    y_pred = model(X)  
    loss = (y_pred - y).pow(2).sum()

    print(f'Loss train_set: {loss:.8f}')
Loss train_set: 0.04672889

Since we decided that our output layer would contain a single neuron, each prediction contains a single value. For example, the first 5 predicted values look like this:

In [27]:
y_pred[:5]
Out[27]:
tensor([[0.0297],
        [0.0404],
        [0.0731],
        [0.0552],
        [0.0586]])

We save the whole model

In [28]:
torch.save(model,'/home/wojciech/Pulpit/7/byk12.pb')
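
Saving the whole model like this pickles the entire module object. A commonly used alternative is to save only the learned parameters via the state_dict; a minimal sketch, assuming the same model as above and a hypothetical file path:

# Sketch (not from the original notebook): save only the parameters,
# then rebuild the same architecture and load the weights back in.
import torch
import torch.nn as nn

torch.save(model.state_dict(), '/home/wojciech/Pulpit/7/byk12_state.pt')

model2 = nn.Sequential(
    nn.Linear(11, 30),
    nn.ReLU(),
    nn.Linear(30, 1),
)
model2.load_state_dict(torch.load('/home/wojciech/Pulpit/7/byk12_state.pt'))
model2.eval()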

We reload the whole model

In [29]:
KOT = torch.load('/home/wojciech/Pulpit/7/byk12.pb')
KOT.eval()
Out[29]:
Sequential(
  (0): Linear(in_features=11, out_features=30, bias=True)
  (1): ReLU()
  (2): Linear(in_features=30, out_features=1, bias=True)
)
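
A quick sanity check one might add here (a minimal sketch, not part of the original notebook): the reloaded model should return exactly the same predictions as the one still in memory.

# Sketch: compare a few predictions from the original and the reloaded model.
with torch.no_grad():
    print(model(X[:3]))
    print(KOT(X[:3]))   # should print identical values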

By substituting other independent variables, you can obtain the corresponding vector of predicted outputs.

We pick a random record from the tensor.

In [30]:
X_exp = X[85] 
X_exp
Out[30]:
tensor([ 0.1112,  0.9693,  1.1518, -2.1961, -1.8163, -1.4758,  0.1856, -0.5823,
         0.6798,  1.2684, -1.2157])
In [31]:
y_exp = y[85]
y_exp
Out[31]:
tensor([0.0429])
In [32]:
y_pred_exp = model(X_exp)
y_pred_exp
Out[32]:
tensor([0.0459], grad_fn=<AddBackward0>)
In [33]:
y_pred*100
Out[33]:
tensor([[2.9664],
        [4.0416],
        [7.3108],
        ...,
        [5.7880],
        [4.0189],
        [4.9219],
        [3.9931]])
In [34]:
df.loc[85,'Happiness Score']
Out[34]:
5.007
In [35]:
(y_exp - y_pred_exp).pow(2).sum()
Out[35]:
tensor(8.7851e-06, grad_fn=<SumBackward0>)

r2_score_compute


In [36]:
def r2_score_compute_fn(y_pred, y):
    # ratio of the sum of squares of (y_pred - mean of y) to the total sum of squares of y
    e = torch.sum((y_pred-y.mean()) ** 2) / torch.sum((y - y.mean()) ** 2)
    return 1 - e.item()
In [37]:
r2_score_compute_fn(y, y_pred)
Out[37]:
0.4201156497001648
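
For reference, the textbook form of R² is 1 - SS_res / SS_tot; a minimal sketch of a function using that definition (an alternative formulation, not the one used in the cell above):

# Sketch: R^2 as 1 - residual sum of squares / total sum of squares.
def r2_standard(y_true, y_hat):
    ss_res = torch.sum((y_true - y_hat) ** 2)          # residual sum of squares
    ss_tot = torch.sum((y_true - y_true.mean()) ** 2)  # total sum of squares
    return (1 - ss_res / ss_tot).item()

# r2_standard(y, y_pred)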
In [46]:
a = pd.DataFrame(y)
b = pd.DataFrame(y_pred)
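
One way to compare the actual values (a) with the predictions (b), as a minimal sketch (assuming matplotlib; not taken from the original notebook):

# Sketch: put actual and predicted values side by side and plot the first 100 rows.
import matplotlib.pyplot as plt

wynik = pd.concat([a, b], axis=1)
wynik.columns = ['y', 'y_pred']
wynik.head(100).plot(figsize=(10, 4))
plt.show()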

Pytorch regression _1.1_[WorldHappinessReport] https://sigmaquality.pl/models/pytorch/pytorch_1-1_regression-worldhappinessreport-290420201753/ Wed, 29 Apr 2020 15:55:44 +0000

290420201753.

Creating small prototypes with full combat capability

Goals:

1. plug in a real data file
2. train the model, save it, and run it
3. verify that the model also works on other data
4. extra goal - evaluate the model





https://github.com/jcjohnson/pytorch-examples#pytorch-custom-nn-modules

In the examples above we had to implement both the forward and the backward pass of our neural network by hand. Manually implementing the backward pass is not a big problem for a small two-layer network, but it can quickly become very cumbersome for large, complex networks.
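
This is exactly the problem that PyTorch's autograd solves; a minimal sketch of the same kind of two-layer network where the backward pass is generated automatically (toy shapes chosen here for illustration, not the notebook's data):

# Sketch: the manual backprop used later in this notebook can be replaced by autograd.
import torch

N, D_in, H, D_out = 64, 10, 5, 1          # toy sizes for illustration only
x = torch.randn(N, D_in)
y = torch.randn(N, D_out)

w1 = torch.randn(D_in, H, requires_grad=True)   # autograd tracks these weights
w2 = torch.randn(H, D_out, requires_grad=True)

learning_rate = 1e-4
for t in range(500):
    y_pred = x.mm(w1).clamp(min=0).mm(w2)   # forward pass
    loss = (y_pred - y).pow(2).sum()        # sum of squared errors
    loss.backward()                         # autograd computes w1.grad and w2.grad
    with torch.no_grad():                   # update weights without tracking history
        w1 -= learning_rate * w1.grad
        w2 -= learning_rate * w2.grad
        w1.grad.zero_()
        w2.grad.zero_()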

In [1]:
import torch
import torch.nn as nn

I'm starting up the GPU graphics card

In [2]:
device = torch.device('cpu') # computations on the CPU
#device = torch.device('cuda') # computations on the GPU
In [3]:
import pandas as pd

df = pd.read_csv('/home/wojciech/Pulpit/1/WorldHappinessReport.csv')
df.head(3)
Out[3]:
Unnamed: 0 Country Region Happiness Rank Happiness Score Economy (GDP per Capita) Family Health (Life Expectancy) Freedom Trust (Government Corruption) Generosity Dystopia Residual Year
0 0 Afghanistan Southern Asia 153.0 3.575 0.31982 0.30285 0.30335 0.23414 0.09719 0.36510 1.95210 2015.0
1 1 Albania Central and Eastern Europe 95.0 4.959 0.87867 0.80434 0.81325 0.35733 0.06413 0.14272 1.89894 2015.0
2 2 Algeria Middle East and Northern Africa 68.0 5.605 0.93929 1.07772 0.61766 0.28579 0.17383 0.07822 2.43209 2015.0
In [4]:
N, D_in, H, D_out = 64, 1000, 100, 10

The parameters above were set so that the tensor of independent variables and the output tensor get the right shapes (these example sizes from the PyTorch tutorial are replaced by the real data shapes further on):

x: 64 observations and 1000 variables

y: 64 observations and 10 variables

2. Removing empty NaN cells

Neural networks do not like empty NaN cells - if we do not remove them, the output will be NaN.

In [5]:
df = df.dropna(how ='any')
df.isnull().sum()
Out[5]:
Unnamed: 0                       0
Country                          0
Region                           0
Happiness Rank                   0
Happiness Score                  0
Economy (GDP per Capita)         0
Family                           0
Health (Life Expectancy)         0
Freedom                          0
Trust (Government Corruption)    0
Generosity                       0
Dystopia Residual                0
Year                             0
dtype: int64

Creating the input and output data

In [6]:
X = torch.tensor((df['Economy (GDP per Capita)'],df['Freedom'],df['Trust (Government Corruption)']), dtype=torch.float)
X
Out[6]:
tensor([[0.3198, 0.8787, 0.9393,  ..., 0.5917, 0.6364, 0.3758],
        [0.2341, 0.3573, 0.2858,  ..., 0.2495, 0.4616, 0.3364],
        [0.0972, 0.0641, 0.1738,  ..., 0.0568, 0.0782, 0.0954]])

3.1 Transposing the tensor of independent variables so that each variable becomes a column (note that flip(0) reverses the order of the variables first)

In [7]:
X = torch.transpose(X.flip(0),0,1)
X
Out[7]:
tensor([[0.0972, 0.2341, 0.3198],
        [0.0641, 0.3573, 0.8787],
        [0.1738, 0.2858, 0.9393],
        ...,
        [0.0568, 0.2495, 0.5917],
        [0.0782, 0.4616, 0.6364],
        [0.0954, 0.3364, 0.3758]])
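
An equivalent way to build the same design matrix, as a minimal sketch (not from the original notebook) - note that the column order here follows the DataFrame, whereas the flip(0) above reversed it:

# Sketch: build the (n_obs, 3) matrix straight from the DataFrame columns.
cols = ['Economy (GDP per Capita)', 'Freedom', 'Trust (Government Corruption)']
X_alt = torch.tensor(df[cols].values, dtype=torch.float)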

4. Converting the dependent variable into a tensor

The variable 'Happiness Score' was chosen as the output.

In [8]:
y = torch.tensor((df['Happiness Score']), dtype=torch.float)
#y

4.1 Reshaping the output vector so that it becomes a column

In [9]:
y = y.view(y.shape[0],1)
y[:12]
Out[9]:
tensor([[3.5750],
        [4.9590],
        [5.6050],
        [4.0330],
        [6.5740],
        [4.3500],
        [7.2840],
        [7.2000],
        [5.2120],
        [5.9600],
        [4.6940],
        [5.8130]])
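
As a side note, the same column reshape can be written in a couple of equivalent ways; a minimal sketch starting from the original 1-D tensor (names here are for illustration only):

# Sketch: equivalent ways to turn a 1-D tensor of shape (469,) into a (469, 1) column.
y_flat = torch.tensor(df['Happiness Score'].values, dtype=torch.float)  # shape (469,)
y_col = y_flat.view(-1, 1)      # same result as y.view(y.shape[0], 1)
y_col = y_flat.unsqueeze(1)     # also shape (469, 1)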

x = torch.randn(N, D_in, device=device)
y = torch.randn(N, D_out, device=device)

In [10]:
print('X ',X.size())
print('y ',y.size())
X  torch.Size([469, 3])
y  torch.Size([469, 1])

Creating random weights for x and y

# Create random input and output data
x = np.random.randn(N, D_in)
y = np.random.randn(N, D_out)

# Randomly initialize weights
w1 = np.random.randn(D_in, H)
w2 = np.random.randn(H, D_out)

In [11]:
N, D_in = X.shape
N, D_out = y.shape
In [12]:
H = 10
device = torch.device('cpu')
In [13]:
w1 = torch.randn(D_in, H, device=device)
w2 = torch.randn(H, D_out, device=device)
In [14]:
w1[:2]
Out[14]:
tensor([[-1.6132, -1.3645, -1.3592, -0.0723, -0.2487, -0.5041, -0.0603,  0.5292,
          1.6465,  0.4280],
        [ 0.5460, -0.4100, -1.5326, -0.9086,  1.1751, -0.3382,  0.3155,  0.1122,
          0.3900,  0.8720]])
In [15]:
print('w1 dla x: ', w1.shape)
print('w2 dla y: ', w2.shape)
w1 dla x:  torch.Size([3, 10])
w2 dla y:  torch.Size([10, 1])

Defining the training – TRAINING THE MODEL

In [16]:
epochs = 2500
aggregated_losses = []


learning_rate = 0.0001     #<= step size
for t in range(epochs):         #<= number of epochs
 
  h = X.mm(w1)               #<= plain matrix multiplication X*w1
  h_relu = h.clamp(min=0)    #<= ReLU: clamp the activations at min=0
  y_pred = h_relu.mm(w2)     #<= multiplying by w2 gives the prediction y_pred
## Computing the loss as the sum of squared errors
## The loss is a scalar stored in a PyTorch tensor;
## we can read its value as a Python number with loss.item().
  loss = (y_pred - y).pow(2).sum()
  
  aggregated_losses.append(loss) ## needed for the loss plot

  print('Krok:',t, loss.item())        # prints the loss for every epoch
  # Backprop to compute the gradients of w1 and w2 with respect to the loss
  grad_y_pred = 2.0 * (y_pred - y)      # model error times 2
  grad_w2 = h_relu.t().mm(grad_y_pred)
  grad_h_relu = grad_y_pred.mm(w2.t())
  grad_h = grad_h_relu.clone()
  grad_h[h < 0] = 0
  grad_w1 = X.t().mm(grad_h)

  # Update the weights using gradient descent
  w1 -= learning_rate * grad_w1
  w2 -= learning_rate * grad_w2
Krok: 0 23992.55078125
Krok: 1 839.9432373046875
Krok: 2 677.2001342773438
Krok: 3 671.9268188476562
Krok: 4 668.744140625
...
Krok: 1500 409.5762939453125
Krok: 1501 409.4617919921875
Krok: 1502 409.34979248046875
Krok: 1503 409.2416076660156
Krok: 1504 409.1305847167969
Krok: 1505 409.022705078125
Krok: 1506 408.9147644042969
Krok: 1507 408.8049621582031
Krok: 1508 408.6981506347656
Krok: 1509 408.592529296875
Krok: 1510 408.48785400390625
Krok: 1511 408.3835144042969
Krok: 1512 408.2793884277344
Krok: 1513 408.1773986816406
Krok: 1514 408.0750427246094
Krok: 1515 407.9735412597656
Krok: 1516 407.8729248046875
Krok: 1517 407.77252197265625
Krok: 1518 407.67291259765625
Krok: 1519 407.5738220214844
Krok: 1520 407.47509765625
Krok: 1521 407.37689208984375
Krok: 1522 407.2792053222656
Krok: 1523 407.18182373046875
Krok: 1524 407.084716796875
Krok: 1525 406.98828125
Krok: 1526 406.89208984375
Krok: 1527 406.7963562011719
Krok: 1528 406.7010192871094
Krok: 1529 406.60614013671875
Krok: 1530 406.51153564453125
Krok: 1531 406.41741943359375
Krok: 1532 406.3237609863281
Krok: 1533 406.2304992675781
Krok: 1534 406.1375427246094
Krok: 1535 406.0450439453125
Krok: 1536 405.9529113769531
Krok: 1537 405.8611755371094
Krok: 1538 405.7697448730469
Krok: 1539 405.67889404296875
Krok: 1540 405.5886535644531
Krok: 1541 405.4988708496094
Krok: 1542 405.40966796875
Krok: 1543 405.3208923339844
Krok: 1544 405.2324523925781
Krok: 1545 405.1455383300781
Krok: 1546 405.05902099609375
Krok: 1547 404.9730224609375
Krok: 1548 404.88385009765625
Krok: 1549 404.79449462890625
Krok: 1550 404.7056579589844
Krok: 1551 404.617431640625
Krok: 1552 404.5296630859375
Krok: 1553 404.44219970703125
Krok: 1554 404.3553161621094
Krok: 1555 404.26898193359375
Krok: 1556 404.1875
Krok: 1557 404.099853515625
Krok: 1558 404.014404296875
Krok: 1559 403.9293518066406
Krok: 1560 403.84490966796875
Krok: 1561 403.76239013671875
Krok: 1562 403.6770935058594
Krok: 1563 403.5902404785156
Krok: 1564 403.5043029785156
Krok: 1565 403.41912841796875
Krok: 1566 403.3345947265625
Krok: 1567 403.2501525878906
Krok: 1568 403.16845703125
Krok: 1569 403.0890197753906
Krok: 1570 403.00360107421875
Krok: 1571 402.9202575683594
Krok: 1572 402.84136962890625
Krok: 1573 402.76263427734375
Krok: 1574 402.6783447265625
Krok: 1575 402.598388671875
Krok: 1576 402.5224914550781
Krok: 1577 402.439208984375
Krok: 1578 402.3613586425781
Krok: 1579 402.2853698730469
Krok: 1580 402.20318603515625
Krok: 1581 402.1256408691406
Krok: 1582 402.0504455566406
Krok: 1583 401.96875
Krok: 1584 401.892333984375
Krok: 1585 401.818115234375
Krok: 1586 401.7377624511719
Krok: 1587 401.66790771484375
Krok: 1588 401.5866394042969
Krok: 1589 401.51068115234375
Krok: 1590 401.43896484375
Krok: 1591 401.3601989746094
Krok: 1592 401.2861022949219
Krok: 1593 401.21307373046875
Krok: 1594 401.1416015625
Krok: 1595 401.07598876953125
Krok: 1596 400.9981384277344
Krok: 1597 400.9318542480469
Krok: 1598 400.8622131347656
Krok: 1599 400.7918395996094
Krok: 1600 400.72802734375
Krok: 1601 400.6554870605469
Krok: 1602 400.58758544921875
Krok: 1603 400.5256042480469
Krok: 1604 400.45538330078125
Krok: 1605 400.38677978515625
Krok: 1606 400.31976318359375
Krok: 1607 400.25079345703125
Krok: 1608 400.1815490722656
Krok: 1609 400.1127014160156
Krok: 1610 400.0436096191406
Krok: 1611 399.9768371582031
Krok: 1612 399.9053955078125
Krok: 1613 399.8413391113281
Krok: 1614 399.76898193359375
Krok: 1615 399.7060852050781
Krok: 1616 399.6327209472656
Krok: 1617 399.5654296875
Krok: 1618 399.4971618652344
Krok: 1619 399.4258117675781
Krok: 1620 399.35565185546875
Krok: 1621 399.2881164550781
Krok: 1622 399.218505859375
Krok: 1623 399.1468200683594
Krok: 1624 399.07684326171875
Krok: 1625 399.00726318359375
Krok: 1626 398.9374694824219
Krok: 1627 398.865478515625
Krok: 1628 398.7940979003906
Krok: 1629 398.7232971191406
Krok: 1630 398.6530456542969
Krok: 1631 398.5832214355469
Krok: 1632 398.5137023925781
Krok: 1633 398.44476318359375
Krok: 1634 398.3760986328125
Krok: 1635 398.3079833984375
Krok: 1636 398.239990234375
Krok: 1637 398.1725769042969
Krok: 1638 398.1053771972656
Krok: 1639 398.03857421875
Krok: 1640 397.97210693359375
Krok: 1641 397.90606689453125
Krok: 1642 397.84033203125
Krok: 1643 397.77490234375
Krok: 1644 397.7099609375
Krok: 1645 397.6451110839844
Krok: 1646 397.58074951171875
Krok: 1647 397.5167236328125
Krok: 1648 397.4529724121094
Krok: 1649 397.3896179199219
Krok: 1650 397.3265075683594
Krok: 1651 397.263671875
Krok: 1652 397.2012939453125
Krok: 1653 397.13934326171875
Krok: 1654 397.0774841308594
Krok: 1655 397.0159912109375
Krok: 1656 396.95489501953125
Krok: 1657 396.8940124511719
Krok: 1658 396.83349609375
Krok: 1659 396.77325439453125
Krok: 1660 396.7133483886719
Krok: 1661 396.65185546875
Krok: 1662 396.5863037109375
Krok: 1663 396.5218200683594
Krok: 1664 396.45794677734375
Krok: 1665 396.39471435546875
Krok: 1666 396.33209228515625
Krok: 1667 396.2696228027344
Krok: 1668 396.2075500488281
Krok: 1669 396.14599609375
Krok: 1670 396.0846862792969
Krok: 1671 396.0238952636719
Krok: 1672 395.9635009765625
Krok: 1673 395.9034118652344
Krok: 1674 395.8436279296875
Krok: 1675 395.7841491699219
Krok: 1676 395.72491455078125
Krok: 1677 395.6662292480469
Krok: 1678 395.6078796386719
Krok: 1679 395.5498352050781
Krok: 1680 395.49200439453125
Krok: 1681 395.43450927734375
Krok: 1682 395.377197265625
Krok: 1683 395.3204040527344
Krok: 1684 395.2638854980469
Krok: 1685 395.2075500488281
Krok: 1686 395.1493225097656
Krok: 1687 395.07891845703125
Krok: 1688 395.0058898925781
Krok: 1689 394.9415588378906
Krok: 1690 394.8778991699219
Krok: 1691 394.81494140625
Krok: 1692 394.75201416015625
Krok: 1693 394.6895446777344
Krok: 1694 394.62744140625
Krok: 1695 394.565673828125
Krok: 1696 394.50238037109375
Krok: 1697 394.43536376953125
Krok: 1698 394.3691711425781
Krok: 1699 394.30389404296875
Krok: 1700 394.2392272949219
Krok: 1701 394.1752624511719
Krok: 1702 394.11126708984375
Krok: 1703 394.0472106933594
Krok: 1704 393.983642578125
Krok: 1705 393.9206848144531
Krok: 1706 393.8580017089844
Krok: 1707 393.79583740234375
Krok: 1708 393.7339172363281
Krok: 1709 393.6725769042969
Krok: 1710 393.6114196777344
Krok: 1711 393.55078125
Krok: 1712 393.4903259277344
Krok: 1713 393.4302062988281
Krok: 1714 393.37054443359375
Krok: 1715 393.3111572265625
Krok: 1716 393.2520446777344
Krok: 1717 393.1934509277344
Krok: 1718 393.1350402832031
Krok: 1719 393.0769958496094
Krok: 1720 393.0191955566406
Krok: 1721 392.9617614746094
Krok: 1722 392.9045104980469
Krok: 1723 392.8477783203125
Krok: 1724 392.791259765625
Krok: 1725 392.7350769042969
Krok: 1726 392.6791687011719
Krok: 1727 392.6234436035156
Krok: 1728 392.565185546875
Krok: 1729 392.5074462890625
Krok: 1730 392.44989013671875
Krok: 1731 392.3930969238281
Krok: 1732 392.3373107910156
Krok: 1733 392.2809753417969
Krok: 1734 392.22344970703125
Krok: 1735 392.16650390625
Krok: 1736 392.1098937988281
Krok: 1737 392.0536804199219
Krok: 1738 391.9978942871094
Krok: 1739 391.9424133300781
Krok: 1740 391.8872375488281
Krok: 1741 391.8326416015625
Krok: 1742 391.77825927734375
Krok: 1743 391.723876953125
Krok: 1744 391.66961669921875
Krok: 1745 391.6155700683594
Krok: 1746 391.5616760253906
Krok: 1747 391.5081481933594
Krok: 1748 391.454833984375
Krok: 1749 391.4017333984375
Krok: 1750 391.34893798828125
Krok: 1751 391.2966613769531
Krok: 1752 391.24432373046875
Krok: 1753 391.19232177734375
Krok: 1754 391.1405334472656
Krok: 1755 391.089111328125
Krok: 1756 391.03790283203125
Krok: 1757 390.9869079589844
Krok: 1758 390.9362487792969
Krok: 1759 390.8857421875
Krok: 1760 390.8355712890625
Krok: 1761 390.78558349609375
Krok: 1762 390.73590087890625
Krok: 1763 390.6863708496094
Krok: 1764 390.6370544433594
Krok: 1765 390.588134765625
Krok: 1766 390.5394287109375
Krok: 1767 390.4908447265625
Krok: 1768 390.4425048828125
Krok: 1769 390.39434814453125
Krok: 1770 390.3466491699219
Krok: 1771 390.2989501953125
Krok: 1772 390.2515869140625
Krok: 1773 390.2045593261719
Krok: 1774 390.15765380859375
Krok: 1775 390.1108703613281
Krok: 1776 390.0644836425781
Krok: 1777 390.0182800292969
Krok: 1778 389.9722900390625
Krok: 1779 389.9263916015625
Krok: 1780 389.8808898925781
Krok: 1781 389.8354797363281
Krok: 1782 389.7904052734375
Krok: 1783 389.74542236328125
Krok: 1784 389.70074462890625
Krok: 1785 389.65625
Krok: 1786 389.6120300292969
Krok: 1787 389.56793212890625
Krok: 1788 389.52398681640625
Krok: 1789 389.4803771972656
Krok: 1790 389.43701171875
Krok: 1791 389.39373779296875
Krok: 1792 389.3522033691406
Krok: 1793 389.3125305175781
Krok: 1794 389.2677917480469
Krok: 1795 389.2250061035156
Krok: 1796 389.1826477050781
Krok: 1797 389.1405944824219
Krok: 1798 389.0987854003906
Krok: 1799 389.0572509765625
Krok: 1800 389.0176696777344
Krok: 1801 388.9822082519531
Krok: 1802 388.9389343261719
Krok: 1803 388.8973083496094
Krok: 1804 388.8676452636719
Krok: 1805 388.83233642578125
Krok: 1806 388.7933044433594
Krok: 1807 388.7594909667969
Krok: 1808 388.72320556640625
Krok: 1809 388.68817138671875
Krok: 1810 388.65423583984375
Krok: 1811 388.6176452636719
Krok: 1812 388.5863342285156
Krok: 1813 388.5478210449219
Krok: 1814 388.5191345214844
Krok: 1815 388.478271484375
Krok: 1816 388.4512634277344
Krok: 1817 388.4101257324219
Krok: 1818 388.3763427734375
Krok: 1819 388.3463134765625
Krok: 1820 388.30950927734375
Krok: 1821 388.2826843261719
Krok: 1822 388.24322509765625
Krok: 1823 388.2095947265625
Krok: 1824 388.18182373046875
Krok: 1825 388.144775390625
Krok: 1826 388.1127014160156
Krok: 1827 388.0850830078125
Krok: 1828 388.0489196777344
Krok: 1829 388.0174255371094
Krok: 1830 387.99298095703125
Krok: 1831 387.95751953125
Krok: 1832 387.9263000488281
Krok: 1833 387.8961486816406
Krok: 1834 387.8665466308594
Krok: 1835 387.837158203125
Krok: 1836 387.8081359863281
Krok: 1837 387.77911376953125
Krok: 1838 387.75048828125
Krok: 1839 387.7259826660156
Krok: 1840 387.69464111328125
Krok: 1841 387.66546630859375
Krok: 1842 387.63702392578125
Krok: 1843 387.60888671875
Krok: 1844 387.5809631347656
Krok: 1845 387.5531921386719
Krok: 1846 387.525634765625
Krok: 1847 387.4971923828125
Krok: 1848 387.466796875
Krok: 1849 387.4369812011719
Krok: 1850 387.408935546875
Krok: 1851 387.3842468261719
Krok: 1852 387.35186767578125
Krok: 1853 387.3221130371094
Krok: 1854 387.2930603027344
Krok: 1855 387.2646179199219
Krok: 1856 387.2362365722656
Krok: 1857 387.2088928222656
Krok: 1858 387.18536376953125
Krok: 1859 387.1545104980469
Krok: 1860 387.1260070800781
Krok: 1861 387.0982666015625
Krok: 1862 387.0709228515625
Krok: 1863 387.043701171875
Krok: 1864 387.0167541503906
Krok: 1865 386.9899597167969
Krok: 1866 386.9625244140625
Krok: 1867 386.9347229003906
Krok: 1868 386.91107177734375
Krok: 1869 386.8785705566406
Krok: 1870 386.8487854003906
Krok: 1871 386.8206787109375
Krok: 1872 386.798583984375
Krok: 1873 386.7668151855469
Krok: 1874 386.73785400390625
Krok: 1875 386.707763671875
Krok: 1876 386.6810607910156
Krok: 1877 386.6463928222656
Krok: 1878 386.6232604980469
Krok: 1879 386.58587646484375
Krok: 1880 386.5561828613281
Krok: 1881 386.52935791015625
Krok: 1882 386.49700927734375
Krok: 1883 386.4724426269531
Krok: 1884 386.43743896484375
Krok: 1885 386.4146423339844
Krok: 1886 386.3787536621094
Krok: 1887 386.3567810058594
Krok: 1888 386.3211669921875
Krok: 1889 386.29937744140625
Krok: 1890 386.26434326171875
Krok: 1891 386.2422180175781
Krok: 1892 386.2083435058594
Krok: 1893 386.185546875
Krok: 1894 386.1529846191406
Krok: 1895 386.1292724609375
Krok: 1896 386.0984802246094
Krok: 1897 386.0738830566406
Krok: 1898 386.06451416015625
Krok: 1899 386.0202331542969
Krok: 1900 385.991943359375
Krok: 1901 385.9852294921875
Krok: 1902 385.94659423828125
Krok: 1903 385.92620849609375
Krok: 1904 385.9029541015625
Krok: 1905 385.8779296875
Krok: 1906 385.8558349609375
Krok: 1907 385.8344421386719
Krok: 1908 385.8134765625
Krok: 1909 385.792724609375
Krok: 1910 385.77215576171875
Krok: 1911 385.7518005371094
Krok: 1912 385.7315673828125
Krok: 1913 385.7115478515625
Krok: 1914 385.6914978027344
Krok: 1915 385.67156982421875
Krok: 1916 385.6518249511719
Krok: 1917 385.6321105957031
Krok: 1918 385.6125183105469
Krok: 1919 385.593017578125
Krok: 1920 385.5736083984375
Krok: 1921 385.5543518066406
Krok: 1922 385.53509521484375
Krok: 1923 385.5159606933594
Krok: 1924 385.496826171875
Krok: 1925 385.47796630859375
Krok: 1926 385.4591979980469
Krok: 1927 385.4403076171875
Krok: 1928 385.421630859375
Krok: 1929 385.4029846191406
Krok: 1930 385.38458251953125
Krok: 1931 385.3661193847656
Krok: 1932 385.3477783203125
Krok: 1933 385.3293762207031
Krok: 1934 385.3112487792969
Krok: 1935 385.29315185546875
Krok: 1936 385.2752685546875
Krok: 1937 385.2575988769531
Krok: 1938 385.2400207519531
Krok: 1939 385.22247314453125
Krok: 1940 385.2051086425781
Krok: 1941 385.18792724609375
Krok: 1942 385.1708984375
Krok: 1943 385.1539001464844
Krok: 1944 385.1369934082031
Krok: 1945 385.12005615234375
Krok: 1946 385.1034851074219
Krok: 1947 385.0867614746094
Krok: 1948 385.07025146484375
Krok: 1949 385.0538330078125
Krok: 1950 385.0373840332031
Krok: 1951 385.0210876464844
Krok: 1952 385.0047607421875
Krok: 1953 384.98779296875
Krok: 1954 384.97100830078125
Krok: 1955 384.9544372558594
Krok: 1956 384.93798828125
Krok: 1957 384.92169189453125
Krok: 1958 384.90533447265625
Krok: 1959 384.8891906738281
Krok: 1960 384.87310791015625
Krok: 1961 384.8571472167969
Krok: 1962 384.84124755859375
Krok: 1963 384.8254089355469
Krok: 1964 384.8096008300781
Krok: 1965 384.79388427734375
Krok: 1966 384.7782897949219
Krok: 1967 384.76275634765625
Krok: 1968 384.747314453125
Krok: 1969 384.7319030761719
Krok: 1970 384.71661376953125
Krok: 1971 384.701416015625
Krok: 1972 384.6861267089844
Krok: 1973 384.67108154296875
Krok: 1974 384.6560974121094
Krok: 1975 384.6410827636719
Krok: 1976 384.6262512207031
Krok: 1977 384.61138916015625
Krok: 1978 384.5966796875
Krok: 1979 384.5819396972656
Krok: 1980 384.5630798339844
Krok: 1981 384.5448913574219
Krok: 1982 384.52752685546875
Krok: 1983 384.5103454589844
Krok: 1984 384.49371337890625
Krok: 1985 384.47711181640625
Krok: 1986 384.46075439453125
Krok: 1987 384.44451904296875
Krok: 1988 384.428466796875
Krok: 1989 384.4124755859375
Krok: 1990 384.3966064453125
Krok: 1991 384.380859375
Krok: 1992 384.3650817871094
Krok: 1993 384.349609375
Krok: 1994 384.33392333984375
Krok: 1995 384.31854248046875
Krok: 1996 384.3031311035156
Krok: 1997 384.2879638671875
Krok: 1998 384.2727966308594
Krok: 1999 384.25762939453125
Krok: 2000 384.2426452636719
Krok: 2001 384.2276611328125
Krok: 2002 384.2127685546875
Krok: 2003 384.197998046875
Krok: 2004 384.1831359863281
Krok: 2005 384.16851806640625
Krok: 2006 384.15399169921875
Krok: 2007 384.1394348144531
Krok: 2008 384.125
Krok: 2009 384.11053466796875
Krok: 2010 384.09625244140625
Krok: 2011 384.0820007324219
Krok: 2012 384.06781005859375
Krok: 2013 384.05364990234375
Krok: 2014 384.0396423339844
Krok: 2015 384.0256042480469
Krok: 2016 384.0118103027344
Krok: 2017 383.99786376953125
Krok: 2018 383.98406982421875
Krok: 2019 383.9703369140625
Krok: 2020 383.95672607421875
Krok: 2021 383.9431457519531
Krok: 2022 383.9295349121094
Krok: 2023 383.91619873046875
Krok: 2024 383.9034423828125
Krok: 2025 383.8903503417969
Krok: 2026 383.8775939941406
Krok: 2027 383.86492919921875
Krok: 2028 383.85198974609375
Krok: 2029 383.8393249511719
Krok: 2030 383.8270263671875
Krok: 2031 383.8150329589844
Krok: 2032 383.8025207519531
Krok: 2033 383.790283203125
Krok: 2034 383.7783203125
Krok: 2035 383.76629638671875
Krok: 2036 383.7544250488281
Krok: 2037 383.7425231933594
Krok: 2038 383.7307434082031
Krok: 2039 383.7190856933594
Krok: 2040 383.707275390625
Krok: 2041 383.6955871582031
Krok: 2042 383.6841125488281
Krok: 2043 383.6724548339844
Krok: 2044 383.6611328125
Krok: 2045 383.6495666503906
Krok: 2046 383.6381530761719
Krok: 2047 383.6268615722656
Krok: 2048 383.6155090332031
Krok: 2049 383.60418701171875
Krok: 2050 383.5930480957031
Krok: 2051 383.58184814453125
Krok: 2052 383.5706787109375
Krok: 2053 383.5596618652344
Krok: 2054 383.54852294921875
Krok: 2055 383.5376892089844
Krok: 2056 383.5265808105469
Krok: 2057 383.51568603515625
Krok: 2058 383.5047607421875
Krok: 2059 383.4939880371094
Krok: 2060 383.48321533203125
Krok: 2061 383.47235107421875
Krok: 2062 383.46173095703125
Krok: 2063 383.45098876953125
Krok: 2064 383.4403381347656
Krok: 2065 383.42974853515625
Krok: 2066 383.41925048828125
Krok: 2067 383.4087829589844
Krok: 2068 383.3982849121094
Krok: 2069 383.3879089355469
Krok: 2070 383.37738037109375
Krok: 2071 383.36700439453125
Krok: 2072 383.35675048828125
Krok: 2073 383.34649658203125
Krok: 2074 383.33624267578125
Krok: 2075 383.3260498046875
Krok: 2076 383.31591796875
Krok: 2077 383.3057556152344
Krok: 2078 383.29559326171875
Krok: 2079 383.2855529785156
Krok: 2080 383.2756042480469
Krok: 2081 383.265625
Krok: 2082 383.25567626953125
Krok: 2083 383.245849609375
Krok: 2084 383.2359924316406
Krok: 2085 383.2261047363281
Krok: 2086 383.21630859375
Krok: 2087 383.2065124511719
Krok: 2088 383.19683837890625
Krok: 2089 383.187255859375
Krok: 2090 383.1775207519531
Krok: 2091 383.16796875
Krok: 2092 383.15838623046875
Krok: 2093 383.1488342285156
Krok: 2094 383.1392822265625
Krok: 2095 383.1298828125
Krok: 2096 383.1204528808594
Krok: 2097 383.1110534667969
Krok: 2098 383.1015930175781
Krok: 2099 383.09222412109375
Krok: 2100 383.08294677734375
Krok: 2101 383.07373046875
Krok: 2102 383.0645446777344
Krok: 2103 383.0552978515625
Krok: 2104 383.0460510253906
Krok: 2105 383.037109375
Krok: 2106 383.0279235839844
Krok: 2107 383.0187683105469
Krok: 2108 383.009765625
Krok: 2109 383.0008544921875
Krok: 2110 382.9918212890625
Krok: 2111 382.9827880859375
Krok: 2112 382.9739685058594
Krok: 2113 382.9650573730469
Krok: 2114 382.9561767578125
Krok: 2115 382.9473571777344
Krok: 2116 382.9385986328125
Krok: 2117 382.9299011230469
Krok: 2118 382.92132568359375
Krok: 2119 382.9127502441406
Krok: 2120 382.904296875
Krok: 2121 382.8958435058594
Krok: 2122 382.8873596191406
Krok: 2123 382.8789367675781
Krok: 2124 382.8706359863281
Krok: 2125 382.8623046875
Krok: 2126 382.85400390625
Krok: 2127 382.8458251953125
Krok: 2128 382.8375244140625
Krok: 2129 382.829345703125
Krok: 2130 382.82122802734375
Krok: 2131 382.81304931640625
Krok: 2132 382.8049011230469
Krok: 2133 382.79681396484375
Krok: 2134 382.7887878417969
Krok: 2135 382.7806091308594
Krok: 2136 382.772705078125
Krok: 2137 382.76470947265625
Krok: 2138 382.7567138671875
Krok: 2139 382.7488098144531
Krok: 2140 382.7409362792969
Krok: 2141 382.7330017089844
Krok: 2142 382.72515869140625
Krok: 2143 382.7173156738281
Krok: 2144 382.7095031738281
Krok: 2145 382.7018127441406
Krok: 2146 382.6940002441406
Krok: 2147 382.68634033203125
Krok: 2148 382.6785583496094
Krok: 2149 382.67095947265625
Krok: 2150 382.6632385253906
Krok: 2151 382.65557861328125
Krok: 2152 382.64813232421875
Krok: 2153 382.6404724121094
Krok: 2154 382.6329650878906
Krok: 2155 382.62542724609375
Krok: 2156 382.6179504394531
Krok: 2157 382.6104431152344
Krok: 2158 382.6030578613281
Krok: 2159 382.5955810546875
Krok: 2160 382.5882263183594
Krok: 2161 382.58087158203125
Krok: 2162 382.5735168457031
Krok: 2163 382.5661926269531
Krok: 2164 382.558837890625
Krok: 2165 382.55157470703125
Krok: 2166 382.5442199707031
Krok: 2167 382.537109375
Krok: 2168 382.5299377441406
Krok: 2169 382.5226135253906
Krok: 2170 382.5155029296875
Krok: 2171 382.50836181640625
Krok: 2172 382.5011901855469
Krok: 2173 382.4941101074219
Krok: 2174 382.487060546875
Krok: 2175 382.4800720214844
Krok: 2176 382.4730224609375
Krok: 2177 382.4659729003906
Krok: 2178 382.45904541015625
Krok: 2179 382.4520568847656
Krok: 2180 382.4451599121094
Krok: 2181 382.4381408691406
Krok: 2182 382.431396484375
Krok: 2183 382.4244384765625
Krok: 2184 382.4175720214844
Krok: 2185 382.4108581542969
Krok: 2186 382.4039001464844
Krok: 2187 382.397216796875
Krok: 2188 382.3904724121094
Krok: 2189 382.3836669921875
Krok: 2190 382.3768615722656
Krok: 2191 382.37017822265625
Krok: 2192 382.363525390625
Krok: 2193 382.35687255859375
Krok: 2194 382.3502502441406
Krok: 2195 382.3436279296875
Krok: 2196 382.3370056152344
Krok: 2197 382.33050537109375
Krok: 2198 382.3239440917969
Krok: 2199 382.31732177734375
Krok: 2200 382.3109130859375
Krok: 2201 382.30426025390625
Krok: 2202 382.2978210449219
Krok: 2203 382.29132080078125
Krok: 2204 382.2848815917969
Krok: 2205 382.2784423828125
Krok: 2206 382.2720947265625
Krok: 2207 382.2657165527344
Krok: 2208 382.2593078613281
Krok: 2209 382.2529296875
Krok: 2210 382.2466125488281
Krok: 2211 382.2403564453125
Krok: 2212 382.2340393066406
Krok: 2213 382.2276611328125
Krok: 2214 382.22137451171875
Krok: 2215 382.2153015136719
Krok: 2216 382.20892333984375
Krok: 2217 382.2027587890625
Krok: 2218 382.1965637207031
Krok: 2219 382.1904296875
Krok: 2220 382.1842956542969
Krok: 2221 382.17822265625
Krok: 2222 382.17205810546875
Krok: 2223 382.16595458984375
Krok: 2224 382.1600036621094
Krok: 2225 382.1539611816406
Krok: 2226 382.1478576660156
Krok: 2227 382.1418151855469
Krok: 2228 382.1358642578125
Krok: 2229 382.1297607421875
Krok: 2230 382.1238098144531
Krok: 2231 382.1178283691406
Krok: 2232 382.11181640625
Krok: 2233 382.10595703125
Krok: 2234 382.1001892089844
Krok: 2235 382.09423828125
Krok: 2236 382.0883483886719
Krok: 2237 382.08251953125
Krok: 2238 382.07672119140625
Krok: 2239 382.0708923339844
Krok: 2240 382.0650939941406
Krok: 2241 382.0592956542969
Krok: 2242 382.05352783203125
Krok: 2243 382.0478210449219
Krok: 2244 382.04205322265625
Krok: 2245 382.0364074707031
Krok: 2246 382.0306396484375
Krok: 2247 382.0249938964844
Krok: 2248 382.019287109375
Krok: 2249 382.013671875
Krok: 2250 382.008056640625
Krok: 2251 382.0024719238281
Krok: 2252 381.9967956542969
Krok: 2253 381.9912109375
Krok: 2254 381.98565673828125
Krok: 2255 381.9800720214844
Krok: 2256 381.9744567871094
Krok: 2257 381.9688415527344
Krok: 2258 381.96343994140625
Krok: 2259 381.9578857421875
Krok: 2260 381.9523620605469
Krok: 2261 381.9468078613281
Krok: 2262 381.94134521484375
Krok: 2263 381.93585205078125
Krok: 2264 381.9304504394531
Krok: 2265 381.9250183105469
Krok: 2266 381.9195861816406
Krok: 2267 381.9141845703125
Krok: 2268 381.9088134765625
Krok: 2269 381.9033203125
Krok: 2270 381.8981018066406
Krok: 2271 381.89276123046875
Krok: 2272 381.887451171875
Krok: 2273 381.8820495605469
Krok: 2274 381.87677001953125
Krok: 2275 381.8714294433594
Krok: 2276 381.8660888671875
Krok: 2277 381.86090087890625
Krok: 2278 381.85565185546875
Krok: 2279 381.8504943847656
Krok: 2280 381.8453063964844
Krok: 2281 381.8401184082031
Krok: 2282 381.8349304199219
Krok: 2283 381.8296813964844
Krok: 2284 381.8245544433594
Krok: 2285 381.8194274902344
Krok: 2286 381.81427001953125
Krok: 2287 381.8091735839844
Krok: 2288 381.8040771484375
Krok: 2289 381.7989501953125
Krok: 2290 381.7938537597656
Krok: 2291 381.788818359375
Krok: 2292 381.7838439941406
Krok: 2293 381.77862548828125
Krok: 2294 381.7734375
Krok: 2295 381.7684020996094
Krok: 2296 381.76318359375
Krok: 2297 381.75811767578125
Krok: 2298 381.75299072265625
Krok: 2299 381.748046875
Krok: 2300 381.74298095703125
Krok: 2301 381.73797607421875
Krok: 2302 381.7329406738281
Krok: 2303 381.7279052734375
Krok: 2304 381.7229309082031
Krok: 2305 381.71795654296875
Krok: 2306 381.7129211425781
Krok: 2307 381.70794677734375
Krok: 2308 381.70306396484375
Krok: 2309 381.6980895996094
Krok: 2310 381.6932067871094
Krok: 2311 381.6883544921875
Krok: 2312 381.6835632324219
Krok: 2313 381.6786193847656
Krok: 2314 381.6737976074219
Krok: 2315 381.66888427734375
Krok: 2316 381.6640319824219
Krok: 2317 381.6593017578125
Krok: 2318 381.6544494628906
Krok: 2319 381.649658203125
Krok: 2320 381.64483642578125
Krok: 2321 381.6401062011719
Krok: 2322 381.6352844238281
Krok: 2323 381.630615234375
Krok: 2324 381.6259460449219
Krok: 2325 381.6211853027344
Krok: 2326 381.616455078125
Krok: 2327 381.6116943359375
Krok: 2328 381.6070556640625
Krok: 2329 381.60235595703125
Krok: 2330 381.59771728515625
Krok: 2331 381.59307861328125
Krok: 2332 381.58843994140625
Krok: 2333 381.5838317871094
Krok: 2334 381.57928466796875
Krok: 2335 381.5746765136719
Krok: 2336 381.5700988769531
Krok: 2337 381.5655212402344
Krok: 2338 381.56097412109375
Krok: 2339 381.5563659667969
Krok: 2340 381.55181884765625
Krok: 2341 381.5473327636719
Krok: 2342 381.542724609375
Krok: 2343 381.53826904296875
Krok: 2344 381.5337829589844
Krok: 2345 381.5293884277344
Krok: 2346 381.52484130859375
Krok: 2347 381.5204772949219
Krok: 2348 381.5159606933594
Krok: 2349 381.5115051269531
Krok: 2350 381.507080078125
Krok: 2351 381.50250244140625
Krok: 2352 381.4979553222656
Krok: 2353 381.4934997558594
Krok: 2354 381.489013671875
Krok: 2355 381.4844970703125
Krok: 2356 381.4800109863281
Krok: 2357 381.47564697265625
Krok: 2358 381.4712219238281
Krok: 2359 381.4668884277344
Krok: 2360 381.4623718261719
Krok: 2361 381.45806884765625
Krok: 2362 381.45379638671875
Krok: 2363 381.4493103027344
Krok: 2364 381.4450378417969
Krok: 2365 381.4406433105469
Krok: 2366 381.436279296875
Krok: 2367 381.4320983886719
Krok: 2368 381.4277648925781
Krok: 2369 381.4234619140625
Krok: 2370 381.41912841796875
Krok: 2371 381.4148864746094
Krok: 2372 381.4106140136719
Krok: 2373 381.40643310546875
Krok: 2374 381.40216064453125
Krok: 2375 381.3979187011719
Krok: 2376 381.3937072753906
Krok: 2377 381.3894958496094
Krok: 2378 381.3854064941406
Krok: 2379 381.38116455078125
Krok: 2380 381.3771057128906
Krok: 2381 381.3728942871094
Krok: 2382 381.3688049316406
Krok: 2383 381.3645324707031
Krok: 2384 381.3605041503906
Krok: 2385 381.35638427734375
Krok: 2386 381.3521728515625
Krok: 2387 381.3481140136719
Krok: 2388 381.34405517578125
Krok: 2389 381.3399658203125
Krok: 2390 381.33587646484375
Krok: 2391 381.3318176269531
Krok: 2392 381.3277282714844
Krok: 2393 381.3237609863281
Krok: 2394 381.3197326660156
Krok: 2395 381.3157958984375
Krok: 2396 381.3117370605469
Krok: 2397 381.30780029296875
Krok: 2398 381.3038330078125
Krok: 2399 381.2998962402344
Krok: 2400 381.2958984375
Krok: 2401 381.29193115234375
Krok: 2402 381.2878723144531
Krok: 2403 381.2840576171875
Krok: 2404 381.28009033203125
Krok: 2405 381.2761535644531
Krok: 2406 381.272216796875
Krok: 2407 381.268310546875
Krok: 2408 381.2644348144531
Krok: 2409 381.2605895996094
Krok: 2410 381.2567138671875
Krok: 2411 381.2528991699219
Krok: 2412 381.2491149902344
Krok: 2413 381.24530029296875
Krok: 2414 381.24139404296875
Krok: 2415 381.2375183105469
Krok: 2416 381.2337951660156
Krok: 2417 381.22998046875
Krok: 2418 381.2261657714844
Krok: 2419 381.222412109375
Krok: 2420 381.2186279296875
Krok: 2421 381.21484375
Krok: 2422 381.2110290527344
Krok: 2423 381.2072448730469
Krok: 2424 381.2035827636719
Krok: 2425 381.1999206542969
Krok: 2426 381.1961975097656
Krok: 2427 381.19244384765625
Krok: 2428 381.1888122558594
Krok: 2429 381.1850891113281
Krok: 2430 381.1814270019531
Krok: 2431 381.1777648925781
Krok: 2432 381.1741027832031
Krok: 2433 381.17047119140625
Krok: 2434 381.1668701171875
Krok: 2435 381.1631774902344
Krok: 2436 381.1595764160156
Krok: 2437 381.1558837890625
Krok: 2438 381.1522521972656
Krok: 2439 381.14874267578125
Krok: 2440 381.1451110839844
Krok: 2441 381.1415710449219
Krok: 2442 381.13800048828125
Krok: 2443 381.1344299316406
Krok: 2444 381.130859375
Krok: 2445 381.12750244140625
Krok: 2446 381.1257629394531
Krok: 2447 381.121337890625
Krok: 2448 381.1175537109375
Krok: 2449 381.1139221191406
Krok: 2450 381.11077880859375
Krok: 2451 381.10906982421875
Krok: 2452 381.1045837402344
Krok: 2453 381.10076904296875
Krok: 2454 381.0980529785156
Krok: 2455 381.0961608886719
Krok: 2456 381.091552734375
Krok: 2457 381.087890625
Krok: 2458 381.0865478515625
Krok: 2459 381.08197021484375
Krok: 2460 381.0786437988281
Krok: 2461 381.0770568847656
Krok: 2462 381.0724792480469
Krok: 2463 381.0721435546875
Krok: 2464 381.0668640136719
Krok: 2465 381.06341552734375
Krok: 2466 381.0617370605469
Krok: 2467 381.057861328125
Krok: 2468 381.05670166015625
Krok: 2469 381.052001953125
Krok: 2470 381.05108642578125
Krok: 2471 381.0462646484375
Krok: 2472 381.04541015625
Krok: 2473 381.0406188964844
Krok: 2474 381.03955078125
Krok: 2475 381.0350341796875
Krok: 2476 381.03363037109375
Krok: 2477 381.0295715332031
Krok: 2478 381.02777099609375
Krok: 2479 381.0240783691406
Krok: 2480 381.0218811035156
Krok: 2481 381.0187683105469
Krok: 2482 381.0160217285156
Krok: 2483 381.0134582519531
Krok: 2484 381.0102844238281
Krok: 2485 381.00830078125
Krok: 2486 381.00445556640625
Krok: 2487 381.0030822753906
Krok: 2488 380.99871826171875
Krok: 2489 380.9979553222656
Krok: 2490 380.99298095703125
Krok: 2491 380.9928283691406
Krok: 2492 380.9878234863281
Krok: 2493 380.98443603515625
Krok: 2494 380.983154296875
Krok: 2495 380.9791259765625
Krok: 2496 380.9785461425781
Krok: 2497 380.97381591796875
Krok: 2498 380.97119140625
Krok: 2499 380.9694519042969
In [17]:
import matplotlib.pyplot as plt

plt.plot(range(epochs), aggregated_losses)
plt.ylabel('Loss')
plt.xlabel('epoch')

plt.show()

Forecast based on the model

  • we simply plug in the same equations that were used in the model
  • the loss value below shows the last training sequence of the model
  • the loss shows how much the model is off (loss = sum of squared errors) after the last training sequence
In [18]:
with torch.no_grad():
    y_pred = h_relu.mm(w2)  
    loss = (y_pred - y).pow(2).sum()

    print(f'Loss train_set: {loss:.8f}')
Loss train_set: 380.96923828
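
For completeness, a minimal sketch of the full forward pass under torch.no_grad(), assuming X, y, w1 and w2 are the tensors trained earlier in this notebook:

with torch.no_grad():
    h = X.mm(w1)                  # linear layer: X * w1
    h_relu = h.clamp(min=0)       # ReLU activation
    y_pred = h_relu.mm(w2)        # output layer: h_relu * w2
    loss = (y_pred - y).pow(2).sum()
    print(f'Loss train_set: {loss:.8f}')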

Since we established that our output layer contains 1 neuron, each prediction contains 1 value. For example, the first 5 predicted values look as follows:

In [19]:
y_pred[:5]
Out[19]:
tensor([[2.4883],
        [4.3975],
        [4.2834],
        [3.4128],
        [5.2886]])

The idea behind such predictions (in a classification setting) is that if the actual class is 0, the value at index 0 should be higher than the value at index 1, and vice versa. We can get the index of the largest value in each row with the following script:

  • np.argmax – returns the indices of the maximum values along an axis.

I DON'T UNDERSTAND ANY OF THIS

In [20]:
import numpy as np
y_pred2 = np.argmax(y_pred, axis=1)
y_pred2[:6]
Out[20]:
tensor([0, 0, 0, 0, 0, 0])

The call above returns the indices of the maximum values along the axis.

The confusion is understandable: this step is borrowed from a classification example. Since our output layer has a single neuron, y_pred has only one column, so the value at index 0 is trivially the maximum in every row – which is why the processed output is all zeros. The illustrative sketch below shows the two-column case where argmax is actually informative.
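
For contrast, an illustrative sketch (not a cell from the original notebook) with a hypothetical prediction tensor of two columns, as in binary classification, where argmax does carry information:

import torch

# hypothetical two-class scores: argmax picks the column with the larger value in each row
scores = torch.tensor([[0.2, 1.5],
                       [2.1, 0.3],
                       [0.7, 0.9]])
print(torch.argmax(scores, dim=1))   # tensor([1, 0, 1])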

In [21]:
y_pred[:4]
Out[21]:
tensor([[2.4883],
        [4.3975],
        [4.2834],
        [3.4128]])

Saving the whole model

This network is so simple that it cannot be saved as a whole model – it is not wrapped in a class (nn.Module), so there is no model definition to serialize. The trained weight tensors themselves can still be saved, as sketched below.
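
A minimal sketch of how the trained weights could be persisted with torch.save and reloaded with torch.load (assuming w1 and w2 are the trained tensors from this notebook; the file name weights.pt is arbitrary):

import torch

# save the trained weight tensors in a dictionary on disk
torch.save({'w1': w1, 'w2': w2}, 'weights.pt')

# later: load them back and rebuild the same forward pass
state = torch.load('weights.pt')
w1_loaded, w2_loaded = state['w1'], state['w2']
y_pred_loaded = X.mm(w1_loaded).clamp(min=0).mm(w2_loaded)   # X = the input tensor used earlier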

Combat use of the model

obraz.png

By plugging in other independent variables we can obtain a vector of output values

We pick a few random records

In [22]:
df2 = df.sample(frac = 0.01, random_state=10) 
print(df2.shape)
print()
df2
(5, 13)

Out[22]:
Unnamed: 0 Country Region Happiness Rank Happiness Score Economy (GDP per Capita) Family Health (Life Expectancy) Freedom Trust (Government Corruption) Generosity Dystopia Residual Year
315 315 Tunisia Middle East and Northern Africa 98.0 5.045 0.97724 0.43165 0.59577 0.23553 0.08170 0.03936 2.68413 2016.0
149 149 Trinidad And Tobago Latin America and Caribbean 41.0 6.168 1.21183 1.18354 0.61483 0.55884 0.01140 0.31844 2.26882 2015.0
198 198 Congo (Kinshasa) Sub-Saharan Africa 125.0 4.272 0.05661 0.80676 0.18800 0.15602 0.06075 0.25458 2.74924 2016.0
167 167 Algeria Middle East and Northern Africa 38.0 6.355 1.05266 0.83309 0.61804 0.21006 0.16157 0.07044 3.40904 2016.0
48 48 Gabon Sub-Saharan Africa 143.0 3.896 1.06024 0.90528 0.43372 0.31914 0.11091 0.06822 0.99895 2015.0

We take the same variables as in the model

In [40]:
X_exp = torch.tensor(df2[['Economy (GDP per Capita)','Freedom','Trust (Government Corruption)']].values)
print(X_exp.shape)
print(X_exp)
print()
torch.Size([5, 3])
tensor([[0.9772, 0.2355, 0.0817],
        [1.2118, 0.5588, 0.0114],
        [0.0566, 0.1560, 0.0607],
        [1.0527, 0.2101, 0.1616],
        [1.0602, 0.3191, 0.1109]], dtype=torch.float64)

X_exp is a Double tensor – we convert it to a Float tensor

In [46]:
X_exp = X_exp.type(torch.FloatTensor)
X_exp
Out[46]:
tensor([[0.9772, 0.2355, 0.0817],
        [1.2118, 0.5588, 0.0114],
        [0.0566, 0.1560, 0.0608],
        [1.0527, 0.2101, 0.1616],
        [1.0602, 0.3191, 0.1109]])
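
Equivalently (a small sketch, not a cell from the original notebook), the tensor could be created as float32 from the start, or converted with .float():

cols = ['Economy (GDP per Capita)', 'Freedom', 'Trust (Government Corruption)']
X_exp = torch.tensor(df2[cols].values, dtype=torch.float32)  # float32 from the start
# or, from an existing double tensor:
X_exp = X_exp.float()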

The output variable y

In [29]:
w01 = np.random.randn(3, 5)
In [49]:
w01 = torch.tensor(w01)
w01 = w01.type(torch.FloatTensor)
w01
/home/wojciech/anaconda3/lib/python3.7/site-packages/ipykernel_launcher.py:1: UserWarning: To copy construct from a tensor, it is recommended to use sourceTensor.clone().detach() or sourceTensor.clone().detach().requires_grad_(True), rather than torch.tensor(sourceTensor).
  """Entry point for launching an IPython kernel.
Out[49]:
tensor([[-0.8869, -0.8968, -2.4913,  0.2018, -0.2955],
        [-0.7605, -0.6304,  0.7503, -0.6016,  0.7678],
        [ 1.5129,  0.6051,  0.8185,  0.6681,  1.2359]])
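
The UserWarning above appears because torch.tensor() was called on something that is already a tensor. A sketch of two warning-free alternatives (assuming w01 starts as the NumPy array created in In [29]):

w01 = torch.from_numpy(np.random.randn(3, 5)).float()   # NumPy array -> float tensor, no warning
# or, if w01 is already a tensor:
w01 = w01.clone().detach().float()                       # the pattern recommended by the warning itself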
In [54]:
y_exp = torch.tensor(df2['Happiness Score'].values)
y_exp = y_exp.type(torch.FloatTensor)
y_exp
Out[54]:
tensor([5.0450, 6.1680, 4.2720, 6.3550, 3.8960])
In [55]:
y_exp = y_exp.view(y_exp.shape[0],1)
y[:12]   # note: this displays the first 12 training targets y, not y_exp
Out[55]:
tensor([[3.5750],
        [4.9590],
        [5.6050],
        [4.0330],
        [6.5740],
        [4.3500],
        [7.2840],
        [7.2000],
        [5.2120],
        [5.9600],
        [4.6940],
        [5.8130]])
In [57]:
print("X_exp:",X_exp.shape)
print("w1:",w1.shape)
X_exp: torch.Size([5, 3])
w1: torch.Size([3, 10])
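
The shapes are compatible for the chained matrix multiplications: (5, 3) x (3, 10) gives (5, 10), and then (5, 10) x w2 gives (5, 1). A quick sanity check, assuming w2 has shape [10, 1] as in the training part of the notebook:

print(X_exp.mm(w1).shape)           # torch.Size([5, 10])
print(X_exp.mm(w1).mm(w2).shape)    # torch.Size([5, 1]) – one prediction per sampled record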

We plug the values into the ready-made model built earlier

In [59]:
h = X_exp.mm(w1)               #<= plain matrix multiplication X_exp * w1
h_relu = h.clamp(min=0)        #<= clamp at min=0 (the ReLU activation)
y_pred_AB = h_relu.mm(w2)

loss = (y_pred_AB - y_exp).pow(2).sum()

print(f'Loss train_set: {loss:.8f}')   # note: the label is a leftover; this loss is for the sampled records
Loss train_set: 17.88477898
In [60]:
h = X.mm(w1)               #<= plain matrix multiplication X * w1
h_relu = h.clamp(min=0)    #<= clamp at min=0 (the ReLU activation)
y_pred = h_relu.mm(w2)

The resulting vector

In [61]:
y_pred_AB
Out[61]:
tensor([[3.5351],
        [6.1301],
        [1.4422],
        [3.6204],
        [4.2392]])
In [62]:
y_exp
Out[62]:
tensor([[5.0450],
        [6.1680],
        [4.2720],
        [6.3550],
        [3.8960]])

obraz.png

Computing the R2 parameter

The cell below computes only the residual sum of squares (SS_res); the full R2 is derived from it in the sketch after the output.

In [66]:
(y_exp - y_pred_AB).pow(2).sum()
Out[66]:
tensor(17.8848)
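
A minimal sketch of the full R2 computation (R2 = 1 - SS_res / SS_tot), using the y_exp and y_pred_AB tensors from the cells above:

ss_res = (y_exp - y_pred_AB).pow(2).sum()        # residual sum of squares (the value shown above)
ss_tot = (y_exp - y_exp.mean()).pow(2).sum()     # total sum of squares around the mean
r2 = 1 - ss_res / ss_tot
print(f'R2: {r2.item():.4f}')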

Artykuł Pytorch regression _1.1_[WorldHappinessReport] pochodzi z serwisu THE DATA SCIENCE LIBRARY.

]]>