Deciphering the Markets with Technical Analysis
In this chapter, we will go through some popular methods of technical analysis and show how to apply them while analyzing market data. We will perform basic algorithmic trading using market trends, support, and resistance.
You may be wondering how we can come up with our own strategies. And are there any naive strategies that worked in the past that we can use as a reference?
As you read in the first chapter https://blog.csdn.net/Linli522362242/article/details/121337016, mankind has been trading assets for centuries. Numerous strategies have been created to increase profit, or sometimes just to preserve it. In this zero-sum game, the competition is considerable, and it necessitates constant innovation in trading models as well as in technology. In this race to get the biggest part of the pie first, it is important to know the basic foundations of analysis in order to create trading strategies. When predicting the market, we mainly assume that the past repeats itself in the future. To predict future prices and volumes, technical analysts study historical market data. Based on behavioral economics and quantitative analysis, this study is divided into two main areas.
First, there are chart patterns. This side of technical analysis is based on recognizing trading patterns and anticipating when they will reproduce in the future. This is usually more difficult to implement.
Second, there are technical indicators. This other side uses mathematical calculations to forecast the direction of the financial market. The list of technical indicators is long enough to fill an entire book on this topic alone, but they fall into a few principal domains: trend, momentum, volume, volatility, and support and resistance. We will focus on the support and resistance strategy as an example to illustrate one of the most well-known technical analysis approaches.
In this chapter, we will cover the following topics:
Trading strategies based on trend and momentum are pretty similar. To use a metaphor to illustrate the difference: the trend strategy uses speed, whereas the momentum strategy uses acceleration. With the trend strategy, we study the historical price data. If the price has kept increasing for the last fixed number of days, we open a long position (long positions make money when market prices are higher than the entry price of the position, and lose money when they are lower), assuming that the price will keep rising.
The trading strategy based on momentum is a technique where we send orders based on the strength of past behavior. The price momentum is the quantity of motion that a price has. The underlying rule is to bet that an asset price with a strong movement in a given direction will keep going in the same direction in the future. We will review a number of technical indicators expressing momentum in the market. Support and resistance are examples of indicators predicting future behavior.
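To make the speed-versus-acceleration distinction concrete, here is a minimal sketch, not from the book, of a simple price-momentum measure; the function name and the lookback value are our own choices:

```python
import pandas as pd

def momentum_signal(prices: pd.Series, lookback: int = 20) -> pd.Series:
    """Momentum = today's price minus the price `lookback` periods ago.
    A positive value bets that the upward move will continue."""
    return prices - prices.shift(lookback)

# toy price series for illustration
prices = pd.Series([100.0, 101.0, 103.0, 102.0, 105.0])
print(momentum_signal(prices, lookback=2).tolist())  # → [nan, nan, 3.0, 1.0, 2.0]
```

A momentum strategy would then bet in the direction of this quantity: go long when it is positive and strong, short when it is strongly negative.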
In the first chapter, we explained the principle of the evolution of prices based on supply and demand. The price decreases when there is an increase in supply, and the price increases when demand rises.
This exploits the market psychology of investors following this trend of buying when the price is low and selling when the price is high.
To illustrate an example of a technical indicator (in this part, support and resistance), we will use the Google data from the first chapter https://blog.csdn.net/Linli522362242/article/details/121337016. Since you will use this data for testing many times, you should store the data frame on disk; doing so will save time when you replay the data. To avoid complications with stock splits, we will only keep dates without splits, which leaves us 620 days. Let's have a look at the following code:
import pandas as pd
from pandas_datareader import data
start_date = '2014-01-01'
end_date = '2018-01-01'
SRC_DATA_FILENAME = 'goog_data.pkl'
try:
    goog_data2 = pd.read_pickle( SRC_DATA_FILENAME )
except FileNotFoundError:
    # Call the function DataReader from the class data
    goog_data2 = data.DataReader( 'GOOG',   # ticker
                                  'yahoo',  # source
                                  start_date, end_date )
    goog_data2.to_pickle( SRC_DATA_FILENAME )
The following code plots the daily highs and lows, together with the resistance level (the maximum high) and the support level (the minimum low) computed over the first 200 days:
import matplotlib.pyplot as plt
# keep the last 620 days (no stock splits) and extract the high/low series
goog_data = goog_data2.tail(620)
lows = goog_data['Low']
highs = goog_data['High']
fig = plt.figure( figsize=(8,6) )
ax1 = fig.add_subplot( 111 )
ax1.plot( highs, color='c', lw=2. )
ax1.plot( lows, color='y', lw=2. )
plt.hlines( highs.head(200).max(), lows.index.values[0],
lows.index.values[-1],
linewidth=2, color='g'
)
plt.hlines( lows.head(200).min(), lows.index.values[0],
lows.index.values[-1],
linewidth=2, color='r'
)
# axvline spans the full axis height; .vlines would also work but requires explicit ymin and ymax values
plt.axvline( x=lows.index.values[200],
             linewidth=3, color='b', linestyle='--'
           )
plt.setp( ax1.get_xticklabels(), rotation=45, horizontalalignment='right', fontsize=12 )
# plt.xticks(fontsize=14)
plt.yticks(fontsize=12)
ax1.set_ylabel('Google price in $', fontsize=14, rotation=90)
plt.show()
In this plot, the green horizontal line is the resistance level (the maximum of the highs over the first 200 days), the red horizontal line is the support level (the minimum of the lows over the same window), and the dashed blue vertical line marks the end of that 200-day window.
In the middle of the following chart, we show three fixed-size time windows. We have added the tolerance margins that we consider to be sufficiently close to the limits (support and resistance):
import matplotlib.pyplot as plt
fig = plt.figure( figsize=(10,6) )
ax1 = fig.add_subplot( 111 )
ax1.plot( highs, color='c', lw=2. )
ax1.plot( lows, color='y', lw=2. )
plt.hlines( highs.head(200).max(), lows.index.values[0],
lows.index.values[-1],
linewidth=2, color='g'
)
plt.hlines( lows.head(200).min(), lows.index.values[0],
lows.index.values[-1],
linewidth=2, color='r'
)
# adding the tolerance margin to be close to the limits (support and resistance)
plt.fill_betweenx( [ highs.head(200).max()*0.96, highs.head(200).max() ],
lows.index.values[200], lows.index.values[400],
facecolor='green', alpha=0.5
)
plt.fill_betweenx( [ lows.head(200).min(), lows.head(200).min() * 1.05 ],
lows.index.values[200], lows.index.values[400],
facecolor='r', alpha=0.5
)
# axvline spans the full axis height; .vlines would also work but requires explicit ymin and ymax values
plt.axvline( x=lows.index.values[200],
             linewidth=3, color='b', linestyle='--'
           )
plt.axvline( x=lows.index.values[400],
             linewidth=3, color='b', linestyle=':'
           )
plt.setp( ax1.get_xticklabels(), rotation=45, horizontalalignment='right', fontsize=12 )
# plt.xticks(fontsize=14)
plt.yticks(fontsize=12)
ax1.set_ylabel('Google price in $', fontsize=14, rotation=90)
plt.show()
If we take a new 200-day window after the first one, the support/resistance levels will be recalculated. We observe that the trading strategy will not exit the GOOG position (while the market keeps rising), since the price does not go back down to the support level.
Since the algorithm cannot get rid of a position, we will need to add more parameters to change the behavior in order to enter a position. The following parameters can be added to the algorithm to change its position:
This phase is critical when creating your trading strategy. You will start by observing how your trading idea will perform using historical data, and then you will increase the number of parameters of this strategy to adjust to more realistic test cases.
In our example, we can introduce two further parameters:
Let's now have a look at the code:
import pandas as pd
from pandas_datareader import data
start_date = '2014-01-01'
end_date = '2018-01-01'
SRC_DATA_FILENAME = 'goog_data.pkl'
try:
    goog_data = pd.read_pickle( SRC_DATA_FILENAME )
    print( 'File found...reading GOOG data' )
except FileNotFoundError:
    print( 'File not found...downloading GOOG data' )
    # Call the function DataReader from the class data
    goog_data = data.DataReader( 'GOOG',   # ticker
                                 'yahoo',  # source
                                 start_date, end_date )
    goog_data.to_pickle( SRC_DATA_FILENAME )
goog_data_signal = pd.DataFrame( index=goog_data.index )
goog_data_signal['price'] = goog_data['Adj Close']
###################
import yfinance as yf
import pandas as pd
start_date = '2014-01-01'
end_date = '2018-01-01'
SRC_DATA_FILENAME = 'goog_data2.pkl'
try:
    goog_data2 = pd.read_pickle( SRC_DATA_FILENAME )
    print( 'File found...reading GOOG data' )
except FileNotFoundError:
    print( 'File not found...downloading GOOG data' )
    goog_data2 = yf.download( 'goog', start=start_date, end=end_date )
    goog_data2.to_pickle( SRC_DATA_FILENAME )
goog_data2.head()
goog_data_signal.head()
Now, let's have a look at the other part of the code where we will implement the trading strategy:
import numpy as np

# a shorter rolling window (bin_width=20 days)
def trading_support_resistance( data, bin_width=20 ):
    # tolerance margin of what we consider being close to the support/resistance level
    data['sup_tolerance'] = np.zeros( len(data) )
    data['res_tolerance'] = np.zeros( len(data) )
    # count the number of times the price reaches a support or resistance line
    data['sup_count'] = np.zeros( len(data) )
    data['res_count'] = np.zeros( len(data) )
    data['sup'] = np.zeros( len(data) )
    data['res'] = np.zeros( len(data) )
    data['positions'] = np.zeros( len(data) )
    data['signal'] = np.zeros( len(data) )
    in_support = 0
    in_resistance = 0
    # assume len(data) >= 2*bin_width: skip the first window of bin_width days
    for idx in range( bin_width-1 + bin_width, len(data) ):
        data_section = data[idx-bin_width : idx+1]  # the most recent bin_width+1 rows
        # The support and resistance levels are the minimum and maximum prices
        # of the window; the tolerance margins sit 20% of the range inside them.
        support_level = min( data_section['price'] )
        resistance_level = max( data_section['price'] )
        data['sup'][idx] = support_level
        data['res'][idx] = resistance_level
        range_level = resistance_level - support_level
        data['sup_tolerance'][idx] = support_level + 0.2*range_level
        data['res_tolerance'][idx] = resistance_level - 0.2*range_level
        if data['res_tolerance'][idx] <= data['price'][idx] <= data['res'][idx]:
            in_resistance += 1
            data['res_count'][idx] = in_resistance
        elif data['sup'][idx] <= data['price'][idx] <= data['sup_tolerance'][idx]:
            in_support += 1
            data['sup_count'][idx] = in_support
        else:
            in_support = 0
            in_resistance = 0
        if in_resistance > 2:   # the price keeps hovering within the resistance margin
            data['signal'][idx] = 1   # it may reach or break through the resistance level
        elif in_support > 2:    # the price keeps hovering within the support margin
            data['signal'][idx] = 0   # it may reach or break through the support level
        else:
            data['signal'][idx] = data['signal'][idx-1]
    data['positions'] = data['signal'].diff()  # positions > 0 ==> buy (go long),
                                               # positions < 0 ==> sell, 0 ==> wait
trading_support_resistance( goog_data_signal )
goog_data_signal.info()
goog_data_signal.reset_index(inplace=True) ###########
import matplotlib.pyplot as plt
fig = plt.figure(figsize=(8,6))
ax1 = fig.add_subplot( 111, ylabel='Google price in $' )
ax1.plot( goog_data_signal['Date'][40:],
goog_data_signal['sup'][40:],
color='g', lw=2., label='sup' )
ax1.plot( goog_data_signal['Date'][40:],
goog_data_signal['res'][40:],
color='b', lw=2., label='res')
ax1.plot( goog_data_signal['Date'],
goog_data_signal['price'],
color='r', lw=2., label='price'
)
# draw an up arrow when we buy one Google share:
ax1.plot( goog_data_signal[ goog_data_signal.positions == 1 ]['Date'],
goog_data_signal[ goog_data_signal.positions == 1 ]['price'],
'^', markersize=7, color='k', label='buy',
)
ax1.plot( goog_data_signal.loc[goog_data_signal.positions==-1.0]['Date'],
goog_data_signal[goog_data_signal.positions == -1.0]['price'],
'v', markersize=7, color='y', label='sell',
)
ax1.set_xlabel('Date')
plt.setp( ax1.get_xticklabels(), rotation=45, horizontalalignment='right' )
plt.legend()
plt.show()
The code returns the following output. The plot shows resistance and support calculated over a 20-day rolling window (note that we skip the first 20-day window and start from the second one):
From this plot, it is observed that a buy order is sent when the price stays in the resistance tolerance margin for more than two consecutive days, and that a sell order is sent when the price stays in the support tolerance margin for more than two consecutive days.
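The buy and sell markers come from differencing the signal column: pandas' diff() turns a 0/1 position signal into +1 (enter) and -1 (exit) events. A tiny illustration with a made-up signal series:

```python
import pandas as pd

signal = pd.Series([0, 0, 1, 1, 1, 0, 0])  # 0 = out of the market, 1 = long
positions = signal.diff()  # +1 where we buy, -1 where we sell, 0 otherwise
print(positions.tolist()[1:])  # → [0.0, 1.0, 0.0, 0.0, -1.0, 0.0]
```

The first element of diff() is NaN (there is no previous value), which is why the plotting code only marks rows where positions equals exactly 1 or -1.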
############################
Why did we skip the first window (window_size=20) and only use the data from the second window onward?
import pandas as pd
from pandas_datareader import data
start_date = '2014-01-01'
end_date = '2018-01-01'
SRC_DATA_FILENAME = 'goog_data.pkl'
try:
    goog_data = pd.read_pickle( SRC_DATA_FILENAME )
    print( 'File found...reading GOOG data' )
except FileNotFoundError:
    print( 'File not found...downloading GOOG data' )
    # Call the function DataReader from the class data
    goog_data = data.DataReader( 'GOOG',   # ticker
                                 'yahoo',  # source
                                 start_date, end_date )
    goog_data.to_pickle( SRC_DATA_FILENAME )
goog_data_signal = pd.DataFrame( index=goog_data.index )
goog_data_signal['price'] = goog_data['Adj Close']
import numpy as np

# a shorter rolling window (bin_width=20 days), this time without skipping the first window
def trading_support_resistance( data, bin_width=20 ):
    # tolerance margin of what we consider being close to the support/resistance level
    data['sup_tolerance'] = np.zeros( len(data) )
    data['res_tolerance'] = np.zeros( len(data) )
    # count the number of times the price reaches a support or resistance line
    data['sup_count'] = np.zeros( len(data) )
    data['res_count'] = np.zeros( len(data) )
    data['sup'] = np.zeros( len(data) )
    data['res'] = np.zeros( len(data) )
    data['positions'] = np.zeros( len(data) )
    data['signal'] = np.zeros( len(data) )
    in_support = 0
    in_resistance = 0
    for idx in range( bin_width-1, len(data) ):  ###
        data_section = data[idx-bin_width+1 : idx]  # the previous bin_width-1 rows, excluding the current one
        # The support and resistance levels are the minimum and maximum prices
        # of the window; the tolerance margins sit 20% of the range inside them.
        support_level = min( data_section['price'] )
        resistance_level = max( data_section['price'] )
        data['sup'][idx] = support_level
        data['res'][idx] = resistance_level
        range_level = resistance_level - support_level
        data['sup_tolerance'][idx] = support_level + 0.2*range_level
        data['res_tolerance'][idx] = resistance_level - 0.2*range_level
        if data['res_tolerance'][idx] <= data['price'][idx] <= data['res'][idx]:
            in_resistance += 1
            data['res_count'][idx] = in_resistance
        elif data['sup'][idx] <= data['price'][idx] <= data['sup_tolerance'][idx]:
            in_support += 1
            data['sup_count'][idx] = in_support
        else:
            in_support = 0
            in_resistance = 0
        if in_resistance > 2:   # the price keeps hovering within the resistance margin
            data['signal'][idx] = 1   # it may reach or break through the resistance level
        elif in_support > 2:    # the price keeps hovering within the support margin
            data['signal'][idx] = 0   # it may reach or break through the support level
        else:
            data['signal'][idx] = data['signal'][idx-1]
    data['positions'] = data['signal'].diff()  # positions > 0 ==> buy (go long),
                                               # positions < 0 ==> sell, 0 ==> wait
trading_support_resistance( goog_data_signal )
goog_data_signal.reset_index(inplace=True)
import matplotlib.pyplot as plt
fig = plt.figure(figsize=(8,6))
ax1 = fig.add_subplot( 111, ylabel='Google price in $' )
ax1.plot( goog_data_signal['Date'][20:],###
goog_data_signal['sup'][20:], ###
color='g', lw=2., label='sup' )###
ax1.plot( goog_data_signal['Date'][20:],###
goog_data_signal['res'][20:], ###
color='b', lw=2., label='res')
ax1.plot( goog_data_signal['Date'],
goog_data_signal['price'],
color='r', lw=2., label='price'
)
# draw an up arrow when we buy one Google share:
ax1.plot( goog_data_signal[ goog_data_signal.positions == 1 ]['Date'],
goog_data_signal[ goog_data_signal.positions == 1 ]['price'],
'^', markersize=7, color='k', label='buy',
)
ax1.plot( goog_data_signal.loc[goog_data_signal.positions==-1.0]['Date'],
goog_data_signal[goog_data_signal.positions == -1.0]['price'],
'v', markersize=7, color='y', label='sell',
)
ax1.set_xlabel('Date')
plt.setp( ax1.get_xticklabels(), rotation=45, horizontalalignment='right' )
plt.legend()
plt.show()
Compared with the previous version, we find that the adjusted close price line now overlaps with the support-level line, and it is dangerous to fail to respond in time: not selling the GOOG share would cost us more money.
initial_capital = float( 1000.0 )
positions = pd.DataFrame( index=goog_data_signal.index ).fillna(0.0)
portfolio = pd.DataFrame( index=goog_data_signal.index ).fillna(0.0)
# Next, we will store the GOOG positions in the following data frame:
positions['GOOG'] = goog_data_signal['signal'] # 1: long one GOOG share, 0: out of the market
# Then, we will store the amount of the GOOG positions for the portfolio in this one:
portfolio['positions'] = ( positions.multiply( goog_data_signal['price'],
axis=0
)
)
# Next, we will calculate the non-invested money (cash or remaining cash):
# positions.diff() == goog_data_signal['positions']
# +1 : buy, -1: sell, 0:you not have any position on the market
portfolio['cash'] = initial_capital - ( positions.diff().multiply( goog_data_signal['price'],
axis=0
)
).cumsum() # if current row in the result of cumsum() <0 : +profit + cash
# if current row in the result of cumsum() >0 : -loss + cash
# The total investment will be calculated by summing the positions and the cash:
portfolio['total'] = portfolio['positions'] + portfolio['cash']
fig = plt.figure( figsize=(8,6) )
ax = fig.add_subplot( 111 )
ax.plot( goog_data_signal['Date'], portfolio)
plt.setp( ax.get_xticklabels(), rotation=45, horizontalalignment='right' )
ax.set_xlabel('Date')
# ['positions', 'cash', 'total']
ax.legend(portfolio.columns, loc='upper left')
plt.show()
Note: total = current cash + current market value of the stock position.
When we create a trading strategy, we have an initial amount of money (cash). We will invest this money (holdings). This holding value is based on the market value of the investment. If we own a stock and the price of this stock increases, the value of the holding will increase. When we decide to sell, we move the value of the holding corresponding to this sale to the cash amount. The sum total of the assets is the sum of the cash and the holdings. The preceding chart shows that the strategy is profitable since the amount of cash increases toward the end. The graph allows you to check whether your trading idea can generate money.
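A minimal numeric sketch of this accounting, with made-up prices and a made-up signal (buy one share, hold it for a day, then sell):

```python
import pandas as pd

initial_capital = 1000.0
price  = pd.Series([10.0, 12.0, 11.0, 15.0])
signal = pd.Series([1, 1, 0, 0])      # long one share for two days, then flat

holdings = signal * price             # market value of the position
# trades: +1 share on day 0, -1 share on day 2 (diff of the signal)
cash = initial_capital - (signal.diff().fillna(signal) * price).cumsum()
total = holdings + cash
print(total.tolist())  # → [1000.0, 1002.0, 1001.0, 1001.0]
```

We buy at 10, the holding rises to 12, and we sell at 11, locking in a $1 profit; the total column reflects the portfolio value at each step, exactly as in the GOOG backtest above.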
############################
In this section, we learned the difference between trend and momentum trading strategies (the trend strategy uses speed, that is, each day's price move, whereas the momentum strategy uses acceleration, that is, behavior over a rolling window), and we implemented a widely used momentum trading strategy based on support and resistance levels. We will now explore new ideas for creating trading strategies using more technical analysis.
This section will show you how to use technical analysis to build trading signals. We will start with one of the most common methods, the simple moving average, and we will discuss more advanced techniques along the way. Here is a list of the signals we will cover:
Simple moving average, which we will refer to as SMA, is a basic technical analysis indicator. The simple moving average, as you may have guessed from its name, is computed by adding up the prices of an instrument over a certain period of time and dividing by the number of time periods. It is basically the average price over a certain time period, with equal weight given to each price. The time period over which it is averaged is often referred to as the lookback period or history. Let's have a look at the following formula of the simple moving average:

SMA = (P1 + P2 + ... + PN) / N

Here, the following applies:
Pi : Price of the instrument in time period i
N : Number of time periods (the lookback period)
Let's implement a simple moving average that computes an average over a 20-day moving window. We will then compare the SMA values against daily prices, and it should be easy to observe the smoothing that SMA achieves.
import pandas as pd
from pandas_datareader import data
start_date = '2014-01-01'
end_date = '2018-01-01'
SRC_DATA_FILENAME = 'goog_data.pkl'
try:
    goog_data2 = pd.read_pickle( SRC_DATA_FILENAME )
except FileNotFoundError:
    # Call the function DataReader from the class data
    goog_data2 = data.DataReader( 'GOOG',   # ticker
                                  'yahoo',  # source
                                  start_date, end_date )
    goog_data2.to_pickle( SRC_DATA_FILENAME )
goog_data = goog_data2.tail(620)
goog_data.head()
In this section, the code demonstrates how you would implement a simple moving average, using a list (history) to maintain a moving window of prices and a list (sma_values) to collect the SMA values; this is equivalent to goog_data['Close'].rolling(window=20, min_periods=1).mean():
close = goog_data['Close']
import statistics as stats
time_period = 20 # number of days over which to average
history = [] # to track a history of prices
sma_values = [] # to track simple moving average values
for close_price in close:
    history.append( close_price )
    if len(history) > time_period: # we only average over the last 'time_period' prices,
        del( history[0] )          # so remove the oldest price
    sma_values.append( stats.mean(history) )
goog_data = goog_data.assign( ClosePrice = pd.Series( close,
index = goog_data.index
)
)
goog_data = goog_data.assign( Simple20DayMovingAverage = pd.Series( sma_values,
index = goog_data.index
)
)
goog_data.head()
goog_data.tail()
close_price = goog_data['ClosePrice']
sma = goog_data['Simple20DayMovingAverage']
import matplotlib.pyplot as plt
import datetime
import matplotlib.ticker as ticker
fig = plt.figure( figsize= (10,6) )
ax1 = fig.add_subplot(111, xlabel='Date', ylabel='Google close price in $')
ax1.plot( goog_data.index.values, close_price, color='g', lw=2., label='close_price' )
ax1.plot( goog_data.index.values, sma, color='r', lw=2., label='sma' )
ax1.xaxis.set_major_locator( ticker.MaxNLocator(12) ) # show at most 12 x-tick labels
ax1.autoscale(enable=True, axis='x', tight=True) # fit the curves tightly against the y-axis
ax1.margins(0, 0.05) # keep a small vertical margin
from matplotlib.dates import DateFormatter
ax1.xaxis.set_major_formatter( DateFormatter('%Y-%m') ) # 2015-08-30 ==> 2015-08
plt.setp( ax1.get_xticklabels(), rotation=30, horizontalalignment='right' )
plt.legend()
plt.show()
In this plot, it is easy to observe that the 20-day SMA has the intended smoothing effect: it evens out the micro-volatility in the actual stock price, yielding a more stable price curve.
goog_data['SMA_20'] = goog_data['Close'].rolling(20).mean()
goog_data[:25]
close_price = goog_data['ClosePrice']
sma = goog_data['SMA_20'] ###
import matplotlib.pyplot as plt
import datetime
import matplotlib.ticker as ticker
fig = plt.figure( figsize= (10,6) )
ax1 = fig.add_subplot(111, xlabel='Date', ylabel='Google close price in $')
ax1.plot( goog_data.index.values, close_price, color='g', lw=2., label='close_price' )
ax1.plot( goog_data.index.values, sma, color='r', lw=2., label='sma' )
ax1.xaxis.set_major_locator( ticker.MaxNLocator(12) ) # show at most 12 x-tick labels
ax1.autoscale(enable=True, axis='x', tight=True) # fit the curves tightly against the y-axis
ax1.margins(0, 0.05) # keep a small vertical margin
from matplotlib.dates import DateFormatter
ax1.xaxis.set_major_formatter( DateFormatter('%Y-%m') ) # 2015-08-30 ==> 2015-08
plt.setp( ax1.get_xticklabels(), rotation=30, horizontalalignment='right' )
plt.legend()
plt.show()
Note the difference from the previous SMA curve: the SMA values are NaN for the first 19 days, because rolling(20) requires a full 20-day window by default.
min_periods : int, default None
Minimum number of observations in the window required to have a value (otherwise the result is NA). For a window that is specified by an offset, min_periods defaults to 1. Otherwise, min_periods defaults to the size of the window.
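A tiny illustration of the effect of min_periods on a rolling mean:

```python
import pandas as pd

s = pd.Series([1.0, 2.0, 3.0, 4.0])
print(s.rolling(window=3).mean().tolist())                 # → [nan, nan, 2.0, 3.0]
print(s.rolling(window=3, min_periods=1).mean().tolist())  # → [1.0, 1.5, 2.0, 3.0]
```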
goog_data['SMA_20'] = goog_data['Close'].rolling(window=20, min_periods=1).mean()
goog_data[:25]
Note that Yahoo Finance uses a 1-week interval (interval = 1W) on its chart to make the GOOG close price curve smoother, so a 20-period SMA there spans 20 weeks; setting the SMA period to 5 on the weekly chart gives a moving average more similar to the 20-day one we drew.
The exponential moving average, which we will refer to as the EMA, is the single most well-known and widely used technical analysis indicator for time series data.
The EMA is similar to the simple moving average, but, instead of weighing all prices in the history equally, it places more weight on the most recent price observation and less weight on the older price observations. This is endeavoring to capture the intuitive idea that the new price observation has more up-to-date information than prices in the past. It is also possible to place more weight on older price observations and less weight on the newer price observations. This would try to capture the idea that longer-term trends have more information than short-term volatile price movements.
The weighting depends on the selected time period of the EMA: the shorter the period, the more weight is placed on the newest observation.
Based on the description of the EMA, it is formulated as a weight factor K applied to the new price observation, and a weight factor applied to the current EMA value to get the new EMA value. Since the sum of the weights should be 1 to keep the EMA in the same units as the price, that is, $s, the weight factor applied to the old EMA value turns out to be (1 - K). Hence, we get the following two formulations of the new EMA value based on the old EMA value and the new price observation, which are the same definition written in two different forms:

EMA = ( P - EMA_old ) * K + EMA_old

OR

EMA = P * K + EMA_old * ( 1 - K )

Here, the following applies:
P : Current price of the instrument
EMA_old : EMA value prior to the current price observation
K : Smoothing constant, most commonly set to 2 / (n + 1)
n : Number of time periods (similar to what we used in the simple moving average)
Let's implement an exponential moving average with 20 days as the number of time periods to compute the average over. We will use a default smoothing factor of 2 / (n + 1) for this implementation. Similar to the SMA, the EMA smooths out the noise in daily prices, but it has the advantage of allowing us to weight recent prices more heavily than an SMA does, which weights all prices uniformly.
In the following code, we will see the implementation of the exponential moving average:
close = goog_data['Close']
num_periods = 20 # number of days over which to average
K = 2/(num_periods+1) # smoothing constant
ema_p = 0
ema_values = [] # to hold computed EMA values
for close_price in close:
    if ema_p == 0: # first observation: EMA = current price
        ema_p = close_price
    else:
        ema_p = ( close_price - ema_p )*K + ema_p
    ema_values.append( ema_p )
# append operation: goog_data['ClosePrice']
goog_data = goog_data.assign( ClosePrice=pd.Series( close,
index=goog_data.index
)
)
goog_data = goog_data.assign( Exponential20DayMovingAverage = pd.Series( ema_values,
index=goog_data.index
)
)
close_price = goog_data['ClosePrice']
ema = goog_data['Exponential20DayMovingAverage']
import matplotlib.pyplot as plt
fig = plt.figure( figsize=(10,6) )
ax1 = fig.add_subplot( 111 )#, xlabel='Date', ylabel='Google price in $'
ax1.plot( goog_data.index.values, close_price, color='g', lw=2., label='ClosePrice' )
ax1.plot( goog_data.index.values, ema, color='b', lw=2., label='Exponential20DayMovingAverage' )
ax1.set_xlabel('Date',fontsize=12)
ax1.set_ylabel('Google price in $',fontsize=12)
ax1.xaxis.set_major_locator( ticker.MaxNLocator(12) ) # show at most 12 x-tick labels
ax1.autoscale(enable=True, axis='x', tight=True) # fit the curves tightly against the y-axis
ax1.margins(0, 0.05) # keep a small vertical margin
from matplotlib.dates import DateFormatter
ax1.xaxis.set_major_formatter( DateFormatter('%Y-%m') ) # 2015-08-30 ==> 2015-08
plt.setp( ax1.get_xticklabels(), rotation=30, horizontalalignment='right' )
plt.legend()
plt.show()
https://pandas.pydata.org/docs/reference/api/pandas.DataFrame.ewm.html
adjust : bool, default True
Divide by the decaying adjustment factor in beginning periods to account for the imbalance in relative weightings (viewing the EWMA as a moving average).

When adjust=True (the default), the EW function is calculated using the weights w_i = (1 - alpha)^i. For example, the EW moving average of the series [x0, x1, ..., xt] (or a price list) of the instrument would be:

    y_t = [ x_t + (1-alpha)*x_{t-1} + (1-alpha)^2 * x_{t-2} + ... + (1-alpha)^t * x_0 ]
          / [ 1 + (1-alpha) + (1-alpha)^2 + ... + (1-alpha)^t ]

where 0 <= alpha <= 1 is the smoothing parameter. The one-step-ahead forecast for time T+1 is a weighted average of all of the observations in the series x_0, ..., x_T. The rate at which the weights decrease is controlled by the parameter alpha.

The numerator is a weighted sum of the prices from the current one back to the initial one, using the weight factors (1-alpha)^i; the denominator is the sum of all the weight factors, which can be evaluated with the geometric-series sum formula (common ratio 1-alpha):

    1 + (1-alpha) + ... + (1-alpha)^t = ( 1 - (1-alpha)^(t+1) ) / alpha

When adjust=False, the exponentially weighted function is calculated recursively:

    y_0 = x_0
    y_t = (1-alpha)*y_{t-1} + alpha*x_t
close = goog_data['Close']
num_periods = 20 # number of days over which to average
goog_data['close_20_ema'] = goog_data['Close'].ewm( ignore_na=False,
span=num_periods, # K = 2/(num_periods+1) # smoothing constant
min_periods=0,
adjust=False ###
).mean()
goog_data.head(21)
if adjust=True: https://blog.csdn.net/Linli522362242/article/details/121172551
close = goog_data['Close']
num_periods = 20 # number of days over which to average
goog_data['close_20_ema'] = goog_data['Close'].ewm( ignore_na=False,
span=num_periods, # K = 2/(num_periods+1) # smoothing constant
min_periods=0,
adjust=True ###
).mean()
close_price = goog_data['ClosePrice']
ema = goog_data['Exponential20DayMovingAverage']
ema_20 = goog_data['close_20_ema']
import matplotlib.pyplot as plt
fig = plt.figure( figsize=(10,6) )
ax1 = fig.add_subplot( 111 )#, xlabel='Date', ylabel='Google price in $'
ax1.plot( goog_data.index.values, close_price, color='g', lw=2., label='ClosePrice' )
ax1.plot( goog_data.index.values, ema, color='b', lw=2., label='Exponential20DayMovingAverage' )
ax1.plot( goog_data.index.values, ema_20, color='k', lw=2., label='close_20_ewma' )
ax1.set_xlabel('Date',fontsize=12)
ax1.set_ylabel('Google price in $',fontsize=12)
ax1.xaxis.set_major_locator( ticker.MaxNLocator(12) ) # show at most 12 x-tick labels
ax1.autoscale(enable=True, axis='x', tight=True) # fit the curves tightly against the y-axis
ax1.margins(0, 0.05) # keep a small vertical margin
from matplotlib.dates import DateFormatter
ax1.xaxis.set_major_formatter( DateFormatter('%Y-%m') ) # 2015-08-30 ==> 2015-08
plt.setp( ax1.get_xticklabels(), rotation=30, horizontalalignment='right' )
plt.legend()
adjust=True and adjust=False differ only at the beginning of the series and then give the same values; the initial adjusted EWMA follows the price trend more closely.
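We can check this numerically on a short made-up series: the two variants agree at the first observation, diverge for a few periods, and then the gap shrinks as more observations arrive:

```python
import pandas as pd

s = pd.Series([1.0, 2.0, 3.0, 4.0, 5.0])
diff = (s.ewm(span=3, adjust=True).mean()
        - s.ewm(span=3, adjust=False).mean()).abs()
# zero at the start, then the gap decays toward zero
print(diff.tolist())
```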
goog_data.tail()
%timeit goog_data['Close'].ewm( ignore_na=False,span=num_periods, min_periods=0,adjust=True ).mean()
Faster!
%timeit goog_data['Close'].ewm( ignore_na=False,span=num_periods, min_periods=0,adjust=False ).mean()
import matplotlib.pyplot as plt
fig = plt.figure( figsize=(12,8) )
ax1 = fig.add_subplot( 111 )#, xlabel='Date', ylabel='Google price in $'
ax1.plot( goog_data.index.values, close_price, color='g', lw=2., label='ClosePrice' )
ax1.plot( goog_data.index.values, ema, color='b', lw=2., label='Exponential20DayMovingAverage' )
ax1.plot( goog_data.index.values, ema_20, color='k', lw=2., label='close_20_ewma' )
ax1.plot( goog_data.index.values, sma, color='y', lw=2., label='sma' )
ax1.set_xlabel('Date',fontsize=12)
ax1.set_ylabel('Google price in $',fontsize=12)
ax1.xaxis.set_major_locator(ticker.MaxNLocator(12)) # 24%12=0: we need 10 xticklabels and 12 is close to 10
# or plt.autoscale(enable=True, axis='x', tight=True)
ax1.autoscale(enable=True, axis='x', tight=True) # move all curves to left(touch y-axis)
ax1.margins(0,0.05) # move all curves to up
from matplotlib.dates import DateFormatter
ax1.xaxis.set_major_formatter( DateFormatter('%Y-%m') ) # 2015-08-30 ==> 2015-08
plt.setp( ax1.get_xticklabels(), rotation=30, horizontalalignment='right' )
plt.legend()
plt.show()
From the plot, it is observed that the EMA has a very similar smoothing effect to the SMA, as expected, and it reduces the noise in the raw prices. However, the extra smoothing-factor parameter K available in the EMA, in addition to the period parameter n, allows us to control the relative weight placed on the new price observation compared with older ones. This lets us build different variants of the EMA by varying K to make fast and slow EMAs, even for the same n. We will explore fast and slow EMAs in the rest of this chapter and in later chapters.
The absolute price oscillator, which we will refer to as APO, is a class of indicators that builds on top of moving averages of prices to capture specific short-term deviations in prices.
The absolute price oscillator is computed as the difference between a fast exponential moving average and a slow exponential moving average: APO = EMA_fast - EMA_slow. Intuitively, it measures how far the more reactive EMA_fast is deviating from the more stable EMA_slow. A large difference is usually interpreted as one of two things: instrument prices are starting to trend or break out, or instrument prices are far away from their equilibrium prices, in other words, overbought or oversold:
Let's now implement the absolute price oscillator, with the faster EMA using a period of 10 days and a slower EMA using a period of 40 days, and default smoothing factors being 2/11 and 2/41, respectively, for the two EMAs:
import yfinance as yf
import pandas as pd
start_date = '2014-01-01'
end_date = '2018-01-01'
SRC_DATA_FILENAME = 'goog_data2.pkl'
try:
    goog_data2 = pd.read_pickle( SRC_DATA_FILENAME )
    print( 'File found...reading GOOG data' )
except:
    print( 'File not found...downloading GOOG data' )
    goog_data2 = yf.download( 'goog', start=start_date, end=end_date )
    goog_data2.to_pickle( SRC_DATA_FILENAME )
goog_data=goog_data2.tail(620)
close = goog_data['Close']
num_periods_fast = 10 # time period for the fast EMA
K_fast = 2/(num_periods_fast+1) # smoothing factor for fast EMA
ema_fast = 0 # initial ema
num_periods_slow = 40 # time period for slow EMA
K_slow = 2/(num_periods_slow+1) # smoothing factor for slow EMA
ema_slow = 0 # initial ema
ema_fast_values = [] # we will hold fast EMA values for visualization purposes
ema_slow_values = [] # we will hold slow EMA values for visualization purposes
apo_values = [] # track computed absolute price oscillator values
for close_price in close:
    if ema_fast == 0: # first observation
        ema_fast = close_price
        ema_slow = close_price
    else:
        ema_fast = (close_price - ema_fast) * K_fast + ema_fast
        ema_slow = (close_price - ema_slow) * K_slow + ema_slow
    ema_fast_values.append( ema_fast )
    ema_slow_values.append( ema_slow )
    apo_values.append( ema_fast - ema_slow )
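As a cross-check, the hand-rolled EMA recursion above is exactly what pandas computes with ewm(span=n, adjust=False), where K = 2/(n+1) and the first EMA value is seeded with the first price. A minimal sketch on made-up prices:

```python
import pandas as pd

close = pd.Series([100.0, 102.0, 101.0, 105.0, 107.0, 106.0, 108.5, 110.0])  # made-up prices

# ewm(span=n, adjust=False) applies the same recursion as the loop above
ema_fast = close.ewm(span=10, adjust=False).mean()  # K_fast = 2/11
ema_slow = close.ewm(span=40, adjust=False).mean()  # K_slow = 2/41
apo = ema_fast - ema_slow
```

On the first observation both EMAs equal the first price, so the APO starts at exactly zero, just as in the loop.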
The loop above generates APO values that reach larger positive and negative values when prices move away from the long-term EMA (here, num_periods_slow=40) very quickly (breaking out), which can be interpreted either as the start of a trend or as an overbought/oversold condition. Now, let's visualize the fast and slow EMAs and the APO values generated:
goog_data = goog_data.assign( ClosePrice=pd.Series(close,
index=goog_data.index
)
)
goog_data = goog_data.assign( FastExponential10DayMovingAverage = pd.Series( ema_fast_values,
index=goog_data.index
)
)
goog_data = goog_data.assign( SlowExponential40DayMovingAverage = pd.Series( ema_slow_values,
index=goog_data.index
)
)
goog_data = goog_data.assign( AbsolutePriceOscillator = pd.Series( apo_values,
index=goog_data.index
)
)
close_price = goog_data['ClosePrice']
ema_f = goog_data['FastExponential10DayMovingAverage']
ema_s = goog_data['SlowExponential40DayMovingAverage']
apo = goog_data['AbsolutePriceOscillator']
import matplotlib.pyplot as plt
import matplotlib.ticker as ticker
fig = plt.figure( figsize=(15,8) )
ax1 = fig.add_subplot(211)
ax1.plot( goog_data.index.values, close_price, color='g', lw=2., label='ClosePrice' )
ax1.plot( goog_data.index.values, ema_f, color='b', lw=2., label='FastExponential_10_DayMovingAverage' )
ax1.plot( goog_data.index.values, ema_s, color='k', lw=2., label='SlowExponential_40_DayMovingAverage' )
# ax1.set_xlabel('Date',fontsize=12)
ax1.set_ylabel('Google price in $',fontsize=12)
ax1.legend()
ax2 = fig.add_subplot( 212 )
ax2.plot( goog_data.index.values, apo, color='k', lw=2., label='AbsolutePriceOscillator')
ax2.set_ylabel('APO', fontsize=12)
ax2.set_xlabel('Date', fontsize=12)
ax2.legend()
for ax in (ax1, ax2):
    ax.xaxis.set_major_locator( ticker.MaxNLocator(12) ) # show at most 12 x-axis ticks
    ax.autoscale(enable=True, axis='x', tight=True)      # fit the curves tightly to the x-axis range
    ax.margins(0, 0.05)                                  # keep a small vertical margin
from matplotlib.dates import DateFormatter
ax1.xaxis.set_major_formatter( DateFormatter('%Y-%m') ) # 2015-08-30 ==> 2015-08
plt.setp( ax1.get_xticklabels(), rotation=30, horizontalalignment='right' )
ax2.xaxis.set_major_formatter( DateFormatter('%Y-%m') ) # 2015-08-30 ==> 2015-08
plt.setp( ax2.get_xticklabels(), rotation=30, horizontalalignment='right' )
plt.show()
One observation here is the difference in behavior between fast and slow EMAs. The faster one is more reactive to new price observations, and the slower one is less reactive to new price observations and decays slower.
The moving average convergence divergence is another in the class of indicators that builds on top of moving averages of prices. We'll refer to it as MACD. This goes a step further than the APO. Let's look at it in greater detail.
The moving average convergence divergence was created by Gerald Appel. It is similar in spirit to an absolute price oscillator in that it establishes the difference between a fast exponential moving average and a slow exponential moving average. However, in the case of MACD, we apply a smoothing exponential moving average to the MACD value itself in order to get the final signal output from the MACD indicator. Optionally, you may also look at the difference between MACD values and the EMA of the MACD values (signal) and visualize it as a histogram. A properly configured MACD signal can successfully capture the direction, magnitude, and duration of a trending instrument price:
MACD_EMA_SHORT = 12
MACD_EMA_LONG = 26
MACD_EMA_SIGNAL = 9

@classmethod
def _get_macd(cls, df):
    """ Moving Average Convergence Divergence
    This function will initialize all following columns.
    MACD Line (macd): (12-day EMA - 26-day EMA)
    Signal Line (macds): 9-day EMA of MACD Line
    MACD Histogram (macdh): MACD Line - Signal Line
    :param df: data
    :return: None
    """
    ema_short = 'close_{}_ema'.format(cls.MACD_EMA_SHORT)
    ema_long = 'close_{}_ema'.format(cls.MACD_EMA_LONG)
    ema_signal = 'macd_{}_ema'.format(cls.MACD_EMA_SIGNAL)
    fast = df[ema_short]
    slow = df[ema_long]
    df['macd'] = fast - slow
    df['macds'] = df[ema_signal]
    df['macdh'] = (df['macd'] - df['macds'])
    cls._drop_columns(df, [ema_short, ema_long, ema_signal])
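The same three columns can be sketched in plain pandas without the stockstats machinery. This is a minimal, self-contained version (the function name macd and the sample prices are ours, not from the library) using the conventional 12/26/9 periods and the adjust=False recursion:

```python
import pandas as pd

def macd(close: pd.Series, fast: int = 12, slow: int = 26, signal: int = 9):
    """Return (macd_line, signal_line, histogram) for a close-price series."""
    ema_fast = close.ewm(span=fast, adjust=False).mean()
    ema_slow = close.ewm(span=slow, adjust=False).mean()
    macd_line = ema_fast - ema_slow                                # DIF / macd
    signal_line = macd_line.ewm(span=signal, adjust=False).mean()  # DEA / macds
    histogram = macd_line - signal_line                            # macdh
    return macd_line, signal_line, histogram

prices = pd.Series([100.0, 101.5, 103.0, 102.0, 104.5, 106.0, 105.0, 107.5])  # made-up prices
m, s, h = macd(prices)
```

Both EMAs start at the first price, so the MACD line, signal line, and histogram all start at exactly zero.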
Let's implement a moving average convergence divergence signal with a fast EMA period of 10 days, a slow EMA period of 40 days, and with default smoothing factors of 2/11 and 2/41, respectively:
import yfinance as yf
import pandas as pd
start_date = '2014-01-01'
end_date = '2018-01-01'
SRC_DATA_FILENAME = 'goog_data2.pkl'
try:
    goog_data2 = pd.read_pickle( SRC_DATA_FILENAME )
    print( 'File found...reading GOOG data' )
except:
    print( 'File not found...downloading GOOG data' )
    goog_data2 = yf.download( 'goog', start=start_date, end=end_date )
    goog_data2.to_pickle( SRC_DATA_FILENAME )
goog_data=goog_data2.tail(620)
close = goog_data['Close']
num_periods_fast = 10   # time period for the fast EMA
K_fast = 2 / (num_periods_fast + 1)  # smoothing factor for the fast EMA
ema_fast = 0            # initial EMA value
num_periods_slow = 40   # time period for the slow EMA
K_slow = 2 / (num_periods_slow + 1)  # smoothing factor for the slow EMA
ema_slow = 0            # initial EMA value
num_periods_macd = 20   # MACD EMA time period
K_macd = 2 / (num_periods_macd + 1)  # MACD EMA smoothing factor
ema_macd = 0
ema_fast_values = []    # hold fast EMA values for visualization purposes
ema_slow_values = []    # hold slow EMA values for visualization purposes
macd_values = []        # track MACD values: MACD = EMA_fast - EMA_slow
macd_signal_values = [] # track the signal line: EMA of the MACD values
macd_histogram_values = [] # track the histogram: MACD - MACD_signal
for close_price in close:
    if ema_fast == 0: # first observation
        ema_fast = close_price
        ema_slow = close_price
    else:
        ema_fast = (close_price - ema_fast) * K_fast + ema_fast
        ema_slow = (close_price - ema_slow) * K_slow + ema_slow
    ema_fast_values.append( ema_fast )
    ema_slow_values.append( ema_slow )
    macd = ema_fast - ema_slow # MACD is fast_EMA - slow_EMA
    if ema_macd == 0:
        ema_macd = macd
    else:
        ema_macd = (macd - ema_macd) * K_macd + ema_macd # signal is the EMA of MACD values
    macd_values.append( macd )
    macd_signal_values.append( ema_macd )
    macd_histogram_values.append( macd - ema_macd )
In the preceding code, the MACD value is the difference between the fast and slow EMAs, the signal line is an EMA of the MACD values themselves, and the histogram is the difference between the MACD and its signal line.
Let's look at the code to plot and visualize the different signals and see what we can understand from it:
goog_data = goog_data.assign( ClosePrice=pd.Series(close,
index=goog_data.index
)
)
goog_data = goog_data.assign( FastExponential10DayMovingAverage = pd.Series( ema_fast_values,
index=goog_data.index
)
)
goog_data = goog_data.assign( SlowExponential40DayMovingAverage = pd.Series( ema_slow_values,
index=goog_data.index
)
)
goog_data = goog_data.assign( MovingAverageConvergenceDivergence = pd.Series( macd_values,
index=goog_data.index
)
)
goog_data = goog_data.assign( Exponential20DayMovingAverageOfMACD = pd.Series( macd_signal_values,
index=goog_data.index
)
)
goog_data = goog_data.assign( MACDHistogram = pd.Series( macd_histogram_values,
                                                         index=goog_data.index
                                                       )
                            )
close_price = goog_data['ClosePrice']
ema_f = goog_data['FastExponential10DayMovingAverage']
ema_s = goog_data['SlowExponential40DayMovingAverage']
macd = goog_data['MovingAverageConvergenceDivergence']
ema_macd = goog_data['Exponential20DayMovingAverageOfMACD']
macd_histogram = goog_data['MACDHistogram']
import matplotlib.pyplot as plt
import matplotlib.ticker as ticker
fig = plt.figure( figsize=(15,8) )
ax1 = fig.add_subplot(311)
ax1.plot( goog_data.index.values, close_price, color='g', lw=2., label='ClosePrice' )
ax1.plot( goog_data.index.values, ema_f, color='b', lw=2.,
label='FastExponential_{}_DayMovingAverage'.format(num_periods_fast) )
ax1.plot( goog_data.index.values, ema_s, color='k', lw=2.,
label='SlowExponential_{}_DayMovingAverage'.format(num_periods_slow) )
# ax1.set_xlabel('Date',fontsize=12)
ax1.set_ylabel('Google price in $',fontsize=12)
ax1.legend()
ax2 = fig.add_subplot( 312 )
ax2.plot( goog_data.index.values, macd, color='k', lw=2., label='MovingAverageConvergenceDivergence' )
ax2.plot( goog_data.index.values, ema_macd, color='g', lw=2.,
label='Exponential_{}_DayMovingAverageOfMACD'.format(num_periods_macd))
#ax2.axhline( y=0, lw=2, color='0.7' )
ax2.set_ylabel('MACD', fontsize=12)
ax2.legend()
ax3 = fig.add_subplot( 313 )
ax3.bar( goog_data.index.values, macd_histogram, color='r', label='MACDHistogram', width=0.9 )
ax3.set_ylabel('MACD', fontsize=12)
ax3.legend()
for ax in (ax1, ax2):
    ax.xaxis.set_major_locator( ticker.MaxNLocator(12) ) # show at most 12 x-axis ticks
for ax in (ax1, ax2, ax3):
    ax.autoscale(enable=True, axis='x', tight=True)      # fit the curves tightly to the x-axis range
    ax.margins(0, 0.05)                                  # keep a small vertical margin
ax3.set_xticks([]) # hide x ticks on the histogram panel
ax3.set_ylim(bottom=-30, top=30)
from matplotlib.dates import DateFormatter
ax1.xaxis.set_major_formatter( DateFormatter('%Y-%m') ) # 2015-08-30 ==> 2015-08
plt.setp( ax1.get_xticklabels(), rotation=30, horizontalalignment='right' )
ax2.xaxis.set_major_formatter( DateFormatter('%Y-%m') ) # 2015-08-30 ==> 2015-08
plt.setp( ax2.get_xticklabels(), rotation=30, horizontalalignment='right' )
plt.subplots_adjust( hspace=0.3 )
plt.show()
The preceding code will return the following output. Let's have a look at the plot:
The MACD signal is very similar to the APO, as we expected, but now, in addition, the signal line is an extra smoothing EMA applied on top of the raw MACD values, which captures lasting trending periods by smoothing out the noise of the raw values. Finally, the MACD histogram, which is the difference between the two series, captures the magnitude and direction of the divergence between the MACD and its signal line.
In practice, the MACD is applied by first computing a fast (typically 12-day) moving average and a slow (typically 26-day) moving average. The difference between the two, DIF = EMA_12 - EMA_26, is called the divergence value. In a sustained uptrend, the 12-day EMA sits above the 26-day EMA and the positive divergence (+DIF) keeps growing; conversely, in a downtrend the divergence can turn negative (-DIF), with its absolute value growing larger and larger. When the market starts to turn, the positive or negative divergence must shrink to a certain degree before it truly signals a reversal. The MACD reversal signal is defined as the 9-day moving average of the divergence value (the 9-day EMA of DIF, our MACD_ema). In the MACD calculation, each new trading day's observation is weighted in through the EMA recursion; the popular parameters are 12 and 26.
close = goog_data['Close']
num_periods_fast = 12   # time period for the fast EMA
K_fast = 2 / (num_periods_fast + 1)  # smoothing factor for the fast EMA
ema_fast = 0            # initial EMA value
num_periods_slow = 26   # time period for the slow EMA
K_slow = 2 / (num_periods_slow + 1)  # smoothing factor for the slow EMA
ema_slow = 0            # initial EMA value
num_periods_macd = 9    # MACD EMA time period
K_macd = 2 / (num_periods_macd + 1)  # MACD EMA smoothing factor
ema_macd = 0
ema_fast_values = []    # hold fast EMA values for visualization purposes
ema_slow_values = []    # hold slow EMA values for visualization purposes
macd_values = []        # track MACD values: MACD = EMA_fast - EMA_slow
macd_signal_values = [] # track the signal line: EMA of the MACD values
macd_histogram_values = [] # track the histogram: MACD - MACD_signal
for close_price in close:
    if ema_fast == 0: # first observation
        ema_fast = close_price
        ema_slow = close_price
    else:
        ema_fast = (close_price - ema_fast) * K_fast + ema_fast
        ema_slow = (close_price - ema_slow) * K_slow + ema_slow
    ema_fast_values.append( ema_fast )
    ema_slow_values.append( ema_slow )
    macd = ema_fast - ema_slow # MACD is fast_EMA - slow_EMA
    if ema_macd == 0:
        ema_macd = macd
    else:
        ema_macd = (macd - ema_macd) * K_macd + ema_macd # signal is the EMA of MACD values
    macd_values.append( macd )
    macd_signal_values.append( ema_macd )
    macd_histogram_values.append( macd - ema_macd )
goog_data = goog_data.assign( ClosePrice = pd.Series( close,
                                                      index=goog_data.index
                                                    )
                            )
goog_data = goog_data.assign( FastExponential12DayMovingAverage = pd.Series( ema_fast_values,
                                                                             index=goog_data.index
                                                                           )
                            )
goog_data = goog_data.assign( SlowExponential26DayMovingAverage = pd.Series( ema_slow_values,
                                                                             index=goog_data.index
                                                                           )
                            )
goog_data = goog_data.assign( MovingAverageConvergenceDivergence = pd.Series( macd_values,
                                                                              index=goog_data.index
                                                                            )
                            )
goog_data = goog_data.assign( Exponential9DayMovingAverageOfMACD = pd.Series( macd_signal_values,
                                                                              index=goog_data.index
                                                                            )
                            )
goog_data = goog_data.assign( MACDHistogram = pd.Series( macd_histogram_values,
                                                         index=goog_data.index
                                                       )
                            )
close_price = goog_data['ClosePrice']
ema_f = goog_data['FastExponential12DayMovingAverage']
ema_s = goog_data['SlowExponential26DayMovingAverage']
macd = goog_data['MovingAverageConvergenceDivergence']
ema_macd = goog_data['Exponential9DayMovingAverageOfMACD']
macd_histogram = goog_data['MACDHistogram']
import matplotlib.pyplot as plt
import matplotlib.ticker as ticker
fig = plt.figure( figsize=(15,8) )
ax1 = fig.add_subplot(311)
ax1.plot( goog_data.index.values, close_price, color='g', lw=2., label='ClosePrice' )
ax1.plot( goog_data.index.values, ema_f, color='b', lw=2.,
label='FastExponential_{}_DayMovingAverage'.format(num_periods_fast) )
ax1.plot( goog_data.index.values, ema_s, color='k', lw=2.,
label='SlowExponential_{}_DayMovingAverage'.format(num_periods_slow) )
ax1.set_xlabel('Date',fontsize=12)
ax1.set_ylabel('Google price in $',fontsize=12)
ax1.legend()
ax2 = fig.add_subplot( 312 )
ax2.plot( goog_data.index.values, macd, color='k', lw=2., label='MovingAverageConvergenceDivergence' )
ax2.plot( goog_data.index.values, ema_macd, color='g', lw=2.,
label='Exponential_{}_DayMovingAverageOfMACD'.format(num_periods_macd))
#ax2.axhline( y=0, lw=2, color='0.7' )
ax2.set_ylabel('MACD', fontsize=12)
ax2.legend()
ax3 = fig.add_subplot( 313 )
ax3.bar( goog_data.index.values, macd_histogram, color='r', label='MACDHistogram', width=0.9 )
ax3.set_ylabel('MACD', fontsize=12)
ax3.legend()
for ax in (ax1, ax2):
    ax.xaxis.set_major_locator( ticker.MaxNLocator(12) ) # show at most 12 x-axis ticks
for ax in (ax1, ax2, ax3):
    ax.autoscale(enable=True, axis='x', tight=True)      # fit the curves tightly to the x-axis range
    ax.margins(0, 0.05)                                  # keep a small vertical margin
ax3.set_xticks([]) # hide x ticks on the histogram panel
ax3.set_ylim(bottom=-30, top=30)
from matplotlib.dates import DateFormatter
ax1.xaxis.set_major_formatter( DateFormatter('%Y-%m') ) # 2015-08-30 ==> 2015-08
plt.setp( ax1.get_xticklabels(), rotation=30, horizontalalignment='right' )
ax2.xaxis.set_major_formatter( DateFormatter('%Y-%m') ) # 2015-08-30 ==> 2015-08
plt.setp( ax2.get_xticklabels(), rotation=30, horizontalalignment='right' )
plt.subplots_adjust( hspace=0.3 )
plt.show()
The MACD indicator is thus formed from two lines and one histogram: the fast line (black) is DIF (MovingAverageConvergenceDivergence), the slow line (green) is DEA (MACD_ema), and the histogram is the MACD bar. Investors commonly consult the following guidelines: https://baike.baidu.com/item/MACD%E6%8C%87%E6%A0%87/6271283?fromtitle=MACD&fromid=3334786&fr=aladdin
1. When DIF and DEA are both greater than 0 (above the zero line on the chart) and moving upward, the market is generally in a bullish phase; one can open or hold long positions.
2. When DIF and DEA are both less than 0 (below the zero line) and moving downward, the market is generally in a bearish phase; one can open short positions or stand aside.
3. When DIF and DEA are both greater than 0 but moving downward, the market is generally in a declining stage; one can open short positions or stand aside.
4. When DIF and DEA are both less than 0 but moving upward, the market is generally about to rise; one can open or hold long positions.
The moving average convergence divergence indicator, MACD for short, judges buying and selling opportunities from the convergence and divergence between a short-term exponential moving average and a long-term exponential moving average.
Developed from the principle of moving averages, MACD overcomes the frequent false signals of plain moving averages while preserving most of their effectiveness.
Its trading principles are:
1. DIF (MovingAverageConvergenceDivergence) and DEA (MACD_ema) are both positive and DIF breaks above DEA: a buy reference signal.
2. DIF and DEA are both negative and DIF falls below DEA: a sell reference signal.
3. The DIF line diverges from the candlestick (K-line) trend: the market may be about to reverse.
4. DIF and DEA changing from positive to negative, or from negative to positive, is not a trading signal by itself, because these values lag the market.
1. MACD golden cross: DIFF breaks above DEA from below, a buy signal.
2. MACD death cross: DIFF breaks below DEA from above, a sell signal.
3. MACD turns from green to red: the MACD (bar) value goes from negative to positive, and the market turns from bearish to bullish.
4. MACD turns from red to green: the MACD (bar) value goes from positive to negative, and the market turns from bullish to bearish.
5. When DIFF and DEA are both positive, i.e., above the zero line, the broad trend is a bull market; DIFF breaking above DEA can be taken as a buy signal.
6. When DIFF and DEA are both negative, i.e., below the zero line, the broad trend is a bear market; DIFF falling below DEA can be taken as a sell signal.
7. When the DEA line diverges from the candlestick (K-line) trend, it is a reversal signal.
8. DEA has a high error rate in range-bound markets, but pairing it with the RSI and KDJ indicators can compensate for this shortcoming.
1. Because MACD is a medium- to long-term indicator, the gap between its buy/sell points and the lowest/highest prices can be large. When the market swings within a narrow range or consolidates, entering on a signal and then having to exit right away may leave no profit between the trades, and may even lose the spread or the fees.
2. When prices rise or fall sharply within a day or two, MACD cannot react in time: it moves quite smoothly and lags the market, so during fast, large moves it does not produce an immediate signal and is of little use at such times.
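The golden-cross and death-cross rules above can be sketched mechanically: a golden cross occurs on the bar where DIF moves from below DEA to above it, and a death cross on the bar where it moves from above to below. The function name macd_crosses and the sample series are ours, for illustration only.

```python
import pandas as pd

def macd_crosses(dif: pd.Series, dea: pd.Series) -> pd.Series:
    """Return +1 at a golden cross (DIF crosses above DEA),
    -1 at a death cross (DIF crosses below DEA), and 0 elsewhere."""
    above = dif > dea
    # compare each bar with the previous one; the first bar can never signal
    prev = above.shift(1, fill_value=bool(above.iloc[0]))
    golden = above & ~prev
    death = ~above & prev
    return golden.astype(int) - death.astype(int)

dif = pd.Series([-1.0, -0.5, 0.5, 1.0, 0.2, -0.3])  # made-up DIF values
dea = pd.Series([0.0, 0.0, 0.0, 0.0, 0.0, 0.0])     # made-up DEA values
signals = macd_crosses(dif, dea)
```

Here DIF crosses above DEA on the third bar (a golden cross) and back below on the last bar (a death cross).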
import yfinance as yf
import pandas as pd
start_date = '2014-01-01'
end_date = '2018-01-01'
SRC_DATA_FILENAME = 'goog_data2.pkl'
try:
    goog_data2 = pd.read_pickle( SRC_DATA_FILENAME )
    print( 'File found...reading GOOG data' )
except:
    print( 'File not found...downloading GOOG data' )
    goog_data2 = yf.download( 'goog', start=start_date, end=end_date )
    goog_data2.to_pickle( SRC_DATA_FILENAME )
goog_data=goog_data2.tail(620)
Bollinger bands (BBANDS) also builds on top of moving averages, but incorporates recent price volatility that makes the indicator more adaptive to different market conditions. Let's now discuss this in greater detail.
Bollinger bands is a well-known technical analysis indicator developed by John Bollinger. It computes a moving average of the prices (the middle band) and places an upper band and a lower band a chosen number of standard deviations above and below it.
This band represents the expected volatility of the prices, treating the moving average of the price as the reference price.
Now, when prices move outside of these bands, that can be interpreted as a breakout/trend signal or as an overbought/oversold mean-reversion signal.
Let's look at the equations to compute the upper Bollinger band, BBAND_upper, and the lower Bollinger band, BBAND_lower. Both depend, in the first instance, on the middle Bollinger band, BBAND_middle, which is simply the simple moving average of the previous n time periods (in this case, the last n = 20 days). The upper and lower bands are then computed by adding/subtracting beta * sigma to/from BBAND_middle, where sigma is the standard deviation of prices over the same period and beta is a standard deviation factor of our choice. The larger the value of beta chosen, the greater the Bollinger bandwidth for our signal, so it is just a parameter that controls the width in our trading signal:
BBAND_middle = SMA_n
BBAND_upper = BBAND_middle + beta * sigma
BBAND_lower = BBAND_middle - beta * sigma
Here, the following applies:
beta: standard deviation factor of our choice
To compute the standard deviation, first we compute the variance:
variance = (1/n) * sum over the last n periods of (P_i - SMA_n)^2
Then, the standard deviation is simply the square root of the variance:
sigma = sqrt(variance)
We will implement and visualize Bollinger bands with 20 days as the time period for the SMA (n = 20). In the following code, we use a stdev factor, beta, of 2 to compute the upper band and lower band from the middle band and the standard deviation we compute:
import statistics as stats
import math

close = goog_data['Close']
time_period = 20  # history length for the simple moving average of the middle band
stdev_factor = 2  # standard deviation scaling factor for the upper and lower bands
history = []      # price history for computing the simple moving average
sma_values = []   # moving average of prices, i.e. the middle band
upper_band = []   # upper band values
lower_band = []   # lower band values
for close_price in close:
    # step 1: simple moving average (middle band)
    history.append( close_price )
    if len(history) > time_period: # only maintain at most 'time_period' price observations
        del history[0]
    sma = stats.mean( history )
    sma_values.append( sma )
    # step 2: standard deviation over the same window
    variance = 0 # variance is the square of the standard deviation
    for hist_price in history:
        variance += (hist_price - sma) ** 2
    stdev = math.sqrt( variance / len(history) ) # square root to get the standard deviation
    # step 3: upper and lower bands
    upper_band.append( sma + stdev_factor * stdev )
    lower_band.append( sma - stdev_factor * stdev )
Now, let's add some code to visualize the Bollinger bands and make some observations:
goog_data = goog_data.assign( ClosePrice = pd.Series( close,
index = goog_data.index
)
)
goog_data = goog_data.assign( MiddleBollingerBand_20DaySMA = pd.Series( sma_values,
index = goog_data.index
)
)
goog_data = goog_data.assign( UpperBollingerBand_20DaySMA_2StdevFactor = pd.Series( upper_band,
index = goog_data.index
)
)
goog_data = goog_data.assign( LowerBollingerBand_20DaySMA_2StdevFactor = pd.Series( lower_band,
index = goog_data.index
)
)
close_price = goog_data['ClosePrice']
boll_m = goog_data['MiddleBollingerBand_20DaySMA']
boll_ub = goog_data['UpperBollingerBand_20DaySMA_2StdevFactor']
boll_lb = goog_data['LowerBollingerBand_20DaySMA_2StdevFactor']
import matplotlib.pyplot as plt
import matplotlib.ticker as ticker
from matplotlib.dates import DateFormatter
fig = plt.figure( figsize=(12,6) )
ax1 = fig.add_subplot(111)
ax1.plot( goog_data.index.values, close_price, color='k', lw=2., label='ClosePrice' )
ax1.plot( goog_data.index.values, boll_m, color='b', lw=2., label='MiddleBollingerBand_20DaySMA')
ax1.plot( goog_data.index.values, boll_ub, color='g', lw=2., label='UpperBollingerBand_20DaySMA_2StdevFactor')
ax1.plot( goog_data.index.values, boll_lb, color='r', lw=2., label='LowerBollingerBand_20DaySMA_2StdevFactor')
ax1.fill_between( goog_data.index.values, boll_ub, boll_lb, alpha=0.1 )
ax1.set_xlabel('Date', fontsize=12)
ax1.set_ylabel('Google price in $', fontsize=12)
ax1.xaxis.set_major_locator( ticker.MaxNLocator(12) ) # show at most 12 x-axis ticks
ax1.autoscale(enable=True, axis='x', tight=True)      # fit the curves tightly to the x-axis range
ax1.margins(0, 0.05)                                  # keep a small vertical margin
ax1.xaxis.set_major_formatter( DateFormatter('%Y-%m') ) # 2015-08-30 ==> 2015-08
plt.setp( ax1.get_xticklabels(), rotation=30, horizontalalignment='right' )
ax1.legend()
plt.show()
The price fluctuates within the band between the upper and lower limits. The width of this band changes with the magnitude of the price swings: when price swings widen, the band widens; when the swings are small and the market consolidates, the band narrows.
The Bollinger bands use this envelope to indicate safe high and low price levels.
When variability shrinks and the band narrows, a burst of sharp price movement may occur at any time.
When a high or low pierces the edge of the band and the price immediately falls back inside, a pullback tends to follow.
When the band starts to move and the price crosses from one side of it to the other in this way, it is of considerable help in finding target prices.
The application rule is as follows: when a stock's price fluctuates very little over a period, the Bollinger bands stay narrow for a long stretch; then, on some trading day, the closing price breaks through the band's upper resistance on relatively heavy volume while the bands clearly turn from contracting to expanding. At that point, investors should buy decisively (this is obvious from that day's candlestick chart), because the stock has turned from weak to strong, the short-term upward push will not last only one day, and new short-term highs are likely to appear, so it is reasonable to step in decisively.
For Bollinger bands, when prices stay within the upper and lower bounds, then not much can be said, but, when prices traverse the upper band, then one interpretation can be that prices are breaking out to the upside and will continue to do so. Another interpretation of the same event can be that the trading instrument is overbought and we should expect a bounce back down.
The other case is when prices traverse the lower band, then one interpretation can be that prices are breaking out to the downside and will continue to do so. Another interpretation of the same event can be that the trading instrument is oversold and we should expect a bounce back up. In either case, Bollinger bands helps us to quantify and capture the exact time when this happens.
BOLL indicator application tips:
1. When the price runs between the middle and upper bands and does not break below the middle band, the market is in a bullish phase; only look for buying opportunities on dips, and do not short.
2. When the price runs between the middle and lower bands without rising above the middle band (the middle band being the 20-day simple moving average, SMA_20), the market is bearish; the strategy is to sell on rallies and not to buy.
3. When the price runs along the upper band, the market is in a one-sided uptrend; hold existing long positions patiently as long as the price stays near the upper band.
4. When the price runs along the lower band, the market is in a one-sided, usually fast, downtrend; hold existing short positions patiently as long as the price stays near the lower band.
5. When the price hovers around the middle band, the market is range-bound; for trend traders this is the easiest market in which to lose money, so it is best avoided and watched from the sidelines.
6. A contracting Bollinger channel, with the price oscillating around the middle band and the upper and lower bands narrowing, foreshadows a large move; stay out and wait for the opportunity.
7. A sudden expansion after contraction means an explosive move is starting; the market will likely trend one way afterwards, so positions can be built actively with the trend.
8. After the channel contracts, a false breakout often appears before the real move arrives; this is a trap set by large players, so stay alert and manage position size accordingly.
9. The Bollinger channel is best read on the weekly timeframe; in a one-sided trend, when positions already show large profits, the daily-chart Bollinger rules can be used to exit in order to protect against a deep pullback.
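The breakout interpretations above reduce to classifying each close relative to the bands. A minimal sketch (the function name bollinger_position and the sample series are ours, for illustration):

```python
import pandas as pd

def bollinger_position(close: pd.Series, upper: pd.Series, lower: pd.Series) -> pd.Series:
    """1 when the close breaks above the upper band, -1 when it breaks
    below the lower band, 0 while it stays inside the bands."""
    return (close > upper).astype(int) - (close < lower).astype(int)

close = pd.Series([10.0, 12.5, 11.0, 8.5, 10.5])   # made-up closes
upper = pd.Series([12.0, 12.0, 12.0, 12.0, 12.0])  # made-up flat upper band
lower = pd.Series([9.0, 9.0, 9.0, 9.0, 9.0])       # made-up flat lower band
position = bollinger_position(close, upper, lower)
```

Whether a nonzero value is read as a breakout to follow or as an overbought/oversold level to fade is the strategy decision discussed above.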
The relative strength indicator, which we will refer to as RSI, is quite different from the previous indicators we saw that were based on moving averages of prices. This is based on price changes over periods to capture the strength/magnitude of price moves.
The relative strength indicator was developed by J Welles Wilder. It comprises a lookback period, which it uses to compute the magnitude of the average of gains/price increases over that period, as well as the magnitude of the averages of losses/price decreases over that period. Then, it computes the RSI value that normalizes the signal value to stay between 0 and 100, and attempts to capture if there have been many more gains relative to the losses, or if there have been many more losses relative to the gains. RSI values over 50% indicate an uptrend, while RSI values below 50% indicate a downtrend.
For the last n periods, the following applies: if the price moved up from the previous period, that period's gain is close_t - close_{t-1} and its loss is 0. Otherwise, the following applies: the period's loss is close_{t-1} - close_t and its gain is 0. Averaging these gains and losses over the lookback period gives AvgGain and AvgLoss, and then the following applies:
RS = AvgGain / AvgLoss
RSI = 100 - 100 / (1 + RS)
import yfinance as yf
import pandas as pd
start_date = '2014-01-01'
end_date = '2018-01-01'
SRC_DATA_FILENAME = 'goog_data2.pkl'
try:
    goog_data2 = pd.read_pickle( SRC_DATA_FILENAME )
    print( 'File found...reading GOOG data' )
except:
    print( 'File not found...downloading GOOG data' )
    goog_data2 = yf.download( 'goog', start=start_date, end=end_date )
    goog_data2.to_pickle( SRC_DATA_FILENAME )
goog_data=goog_data2.tail(620)
Now, let's implement and plot a relative strength indicator on our dataset:
Here, avg_gain and avg_loss use the simple average (SMA):
import statistics as stats

time_period = 20     # lookback period to compute gains and losses
gain_history = []    # history of gains over the lookback period (0 if no gain, else magnitude of gain)
loss_history = []    # history of losses over the lookback period (0 if no loss, else magnitude of loss)
avg_gain_values = [] # track average gains for visualization purposes
avg_loss_values = [] # track average losses for visualization purposes
rsi_values = []      # track computed RSI values
last_price = 0       # current_price - last_price > 0 => gain
                     # current_price - last_price < 0 => loss
for close_price in close:
    if last_price == 0:
        last_price = close_price
    gain_history.append( max(0, close_price - last_price) )
    loss_history.append( max(0, last_price - close_price) )
    last_price = close_price
    if len(gain_history) > time_period: # maximum observations is equal to the lookback period
        del gain_history[0]
        del loss_history[0]
    avg_gain = stats.mean( gain_history ) # average gain over the lookback period
    avg_loss = stats.mean( loss_history ) # average loss over the lookback period
    avg_gain_values.append( avg_gain )
    avg_loss_values.append( avg_loss )
    rs = 0
    if avg_loss > 0: # to avoid division by 0, which is undefined
        rs = avg_gain / avg_loss
    rsi = 100 - ( 100 / (1 + rs) )
    rsi_values.append( rsi )
In the RSI loop above, gains and losses are stored as magnitudes (0 when there is no gain or loss, respectively), their simple averages over the lookback period form avg_gain and avg_loss, and the final RSI value is normalized to stay between 0 and 100.
Now, let's look at the code to visualize the final signal as well as the components involved:
goog_data = goog_data.assign( ClosePrice = pd.Series( close,
index = goog_data.index
)
)
goog_data = goog_data.assign( RelativeStrengthAvg_GainOver_20Days = pd.Series( avg_gain_values,
index = goog_data.index
)
)
goog_data = goog_data.assign( RelativeStrengthAvg_LossOver_20Days = pd.Series( avg_loss_values,
index = goog_data.index
)
)
goog_data = goog_data.assign( RelativeStrength_IndicatorOver_20Days = pd.Series( rsi_values,
index = goog_data.index
)
)
close_price = goog_data['ClosePrice']
rs_gain = goog_data['RelativeStrengthAvg_GainOver_20Days']
rs_loss = goog_data['RelativeStrengthAvg_LossOver_20Days']
rsi = goog_data['RelativeStrength_IndicatorOver_20Days']
import matplotlib.pyplot as plt
fig = plt.figure( figsize=(15,10) )
ax1 = fig.add_subplot( 311 )
ax1.plot( goog_data.index.values, close_price, color='k', lw=2., label='ClosePrice' )
ax1.set_ylabel( 'Google price in $', fontsize=12 )
ax1.legend()
ax2 = fig.add_subplot( 312 )
ax2.plot( goog_data.index.values, rs_gain, color='g', lw=2., label='RelativeStrengthAvg_GainOver_20Days' )
ax2.plot( goog_data.index.values, rs_loss, color='r', lw=2., label='RelativeStrengthAvg_LossOver_20Days' )
ax2.set_ylabel( 'RS', fontsize=12 )
ax2.legend()
ax3 = fig.add_subplot( 313 )
ax3.plot( goog_data.index.values, rsi, color='b', lw=2., label='RelativeStrength_IndicatorOver_20Days' )
ax3.axhline( y=50, lw=2, color='0.7' )
ax3.set_ylabel( 'RSI', fontsize=12 )
ax3.legend()
import matplotlib.ticker as ticker
from matplotlib.dates import DateFormatter
for ax in (ax1, ax2, ax3):
    ax.xaxis.set_major_locator( ticker.MaxNLocator(12) ) # show at most 12 x-axis ticks
    ax.autoscale(enable=True, axis='x', tight=True)      # fit the curves tightly to the x-axis range
    ax.margins(0, 0.05)                                  # keep a small vertical margin
    ax.xaxis.set_major_formatter( DateFormatter('%Y-%m') ) # e.g. 2015-08-30 ==> 2015-08
    plt.setp( ax.get_xticklabels(), rotation=30, horizontalalignment='right' )
plt.subplots_adjust( hspace=0.3 ) # space between the axes
plt.show()
The preceding code will return the following output. Let's have a look at the plot:
goog_data[goog_data['RelativeStrength_IndicatorOver_20Days']>50].count(axis=0)['RelativeStrength_IndicatorOver_20Days'] /\
goog_data[goog_data['RelativeStrength_IndicatorOver_20Days']<=50].count(axis=0)['RelativeStrength_IndicatorOver_20Days']
The first observation we can make from applying the RSI to our GOOG dataset is that the AverageGain over our 20-day time frame more often than not exceeds the AverageLoss over the same time frame, which intuitively makes sense because Google has been a very successful stock, increasing in value more or less consistently. Based on that, the RSI indicator also stays above 50% for the majority of the stock's lifetime (days above 50 outnumber the rest by a ratio of 1.6956521739130435, roughly 1.7), again reflecting the continued gains in the Google stock over that period.
@classmethod
def _get_smma(cls, df, column, windows):
    """ get smoothed moving average.
    :param df: data
    :param column: column to operate on
    :param windows: range
    :return: result series
    """
    window = cls.get_only_one_positive_int(windows)
    column_name = '{}_{}_smma'.format(column, window)
    smma = df[column].ewm(
        ignore_na=False, alpha=1.0 / window,
        min_periods=0, adjust=True).mean()
    df[column_name] = smma
    return smma
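Note the choice alpha = 1.0 / window: with adjust=False this exact alpha reproduces Wilder's smoothing recursion y_0 = x_0, y_t = y_{t-1} + (x_t - y_{t-1}) / n, and the adjust=True form used in the snippet above converges to it after the first observations. A minimal sketch on made-up values:

```python
import pandas as pd

values = pd.Series([1.0, 2.0, 3.0, 4.0, 5.0, 4.0, 3.0])  # made-up inputs
n = 3

# Wilder's smoothing recursion: y_0 = x_0, y_t = y_{t-1} + (x_t - y_{t-1}) / n
manual = []
for x in values:
    manual.append(x if not manual else manual[-1] + (x - manual[-1]) / n)

smma = values.ewm(alpha=1.0 / n, adjust=False).mean()
```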
@classmethod
def _get_rsi(cls, df, n_days):
    """ Calculate the RSI (Relative Strength Index) within N days
    calculated based on the formula at:
    https://en.wikipedia.org/wiki/Relative_strength_index
    :param df: data
    :param n_days: N days
    :return: None
    """
    n_days = int(n_days)
    d = df['close_-1_d']
    df['closepm'] = (d + d.abs()) / 2   # positive moves, 0 otherwise
    df['closenm'] = (-d + d.abs()) / 2  # magnitude of negative moves, 0 otherwise
    closepm_smma_column = 'closepm_{}_smma'.format(n_days)
    closenm_smma_column = 'closenm_{}_smma'.format(n_days)
    p_ema = df[closepm_smma_column]
    n_ema = df[closenm_smma_column]
    rs_column_name = 'rs_{}'.format(n_days)
    rsi_column_name = 'rsi_{}'.format(n_days)
    df[rs_column_name] = rs = p_ema / n_ema
    df[rsi_column_name] = 100 - 100 / (1.0 + rs)
    columns_to_remove = ['closepm',
                         'closenm',
                         closepm_smma_column,
                         closenm_smma_column]
    cls._drop_columns(df, columns_to_remove)
n_days_7=7
n_days_14=14
n_days_20 = 20
# close_-1_d — this is the price difference between time t and t-1
goog_data['close_-1_s'] = goog_data['Close'].shift(1)
d = goog_data['close_-1_d'] = goog_data['Close']-goog_data['close_-1_s']
goog_data['closepm'] = ( d+d.abs() )/2 # if d>0: (d+d)/2= d, if d<0, (d+(-d))/2= 0
goog_data['closenm'] = ( -d+d.abs() )/2 # if d>0: (-d+d)/2 = 0, if d<0, ((-d)+(-d))/2= -d (>0)
for n_days in (n_days_20,):
p_ema = goog_data['closepm'].ewm( com = n_days - 1,
min_periods=0, # default 0
adjust=True,
).mean()
n_ema = goog_data['closenm'].ewm( com = n_days - 1,
min_periods=0,
adjust=True,
).mean()
rs_column_name = 'rs_{}'.format(n_days)
rsi_column_name = 'rsi_{}'.format(n_days)
goog_data['p_ema'] = p_ema
goog_data['n_ema'] = n_ema
goog_data[rs_column_name] = rs = p_ema / n_ema
goog_data[rsi_column_name] = 100 - 100 / (1.0 + rs)
goog_data=goog_data.drop(['closepm','closenm','close_-1_s', 'close_-1_d'], axis=1)
goog_data[['RelativeStrengthAvg_GainOver_20Days',
'p_ema',
'RelativeStrengthAvg_LossOver_20Days',
'n_ema',
'RelativeStrength_IndicatorOver_20Days',
'rsi_20'
]
].head(25)
n_days_7=7
n_days_14=14
n_days_20 = 20
# close_-1_d — this is the price difference between time t and t-1
goog_data['close_-1_s'] = goog_data['Close'].shift(1)
d = goog_data['close_-1_d'] = goog_data['Close']-goog_data['close_-1_s']
goog_data['closepm'] = ( d+d.abs() )/2 # if d>0: (d+d)/2= d, if d<0, (d+(-d))/2= 0
goog_data['closenm'] = ( -d+d.abs() )/2 # if d>0: (-d+d)/2 = 0, if d<0, ((-d)+(-d))/2= -d (>0)
for n_days in (n_days_20,):
p_ema = goog_data['closepm'].ewm( com = n_days - 1,
min_periods=0, # default 0
adjust=True,
).mean()
n_ema = goog_data['closenm'].ewm( com = n_days - 1,
min_periods=0,
adjust=True,
).mean()
rs_column_name = 'rs_{}'.format(n_days)
rsi_column_name = 'rsi_{}'.format(n_days)
goog_data['p_ema'] = p_ema
goog_data['n_ema'] = n_ema
goog_data[rs_column_name] = rs = p_ema / n_ema
goog_data[rsi_column_name] = 100 - 100 / (1.0 + rs)
goog_data=goog_data.drop(['closepm','closenm','close_-1_s', 'close_-1_d'], axis=1)
import matplotlib.pyplot as plt
fig = plt.figure( figsize=(15,10) )
ax1 = fig.add_subplot( 311 )
ax1.plot( goog_data.index.values, close_price, color='k', lw=2., label='ClosePrice' )
ax1.set_ylabel( 'Google price in $', fontsize=12 )
ax1.legend()
ax2 = fig.add_subplot( 312 )
ax2.plot( goog_data.index.values, goog_data['p_ema'], color='g', lw=2., label='p_ema_20day' )
ax2.plot( goog_data.index.values, goog_data['n_ema'], color='r', lw=2., label='n_ema_20day' )
ax2.set_ylabel( 'RS', fontsize=12 )
ax2.legend()
ax3 = fig.add_subplot( 313 )
ax3.plot( goog_data.index.values, goog_data['rsi_20'], color='b', lw=2., label='rsi_20' )
ax3.plot( goog_data.index.values, rsi, color='r', lw=2., label='RelativeStrength_IndicatorOver_20Days' )
ax3.axhline( y=30, lw=2, color='0.7') # Line for oversold threshold
ax3.axhline( y=50, lw=2, linestyle='--', color='0.8' ) # Neutral RSI
ax3.axhline( y=70, lw=2, color='0.7') # Line for overbought threshold
ax3.set_ylabel( 'RSI', fontsize=12 )
ax3.legend()
from matplotlib.dates import DateFormatter
for ax in(ax1, ax2, ax3):
ax.xaxis.set_major_locator(ticker.MaxNLocator(12)) # 24%12=0: we need 10 xticklabels and 12 is close to 10
# or plt.autoscale(enable=True, axis='x', tight=True)
ax.autoscale(enable=True, axis='x', tight=True) # move all curves to left(touch y-axis)
ax.margins(0,0.05) # move all curves to up
ax.xaxis.set_major_formatter( DateFormatter('%Y-%m') ) # 2015-08-30 ==> 2015-08
plt.setp( ax.get_xticklabels(), rotation=30, horizontalalignment='right' )
plt.subplots_adjust( hspace=0.3 ) # space between axes
plt.show()
Readings below 30 generally indicate that the stock is oversold, while readings above 70 indicate that it is overbought. Traders will often place this RSI chart below the price chart for the security, so they can compare its recent momentum against its market price
Some traders will consider it a “buy signal” if a security’s RSI reading moves below 30, based on the idea that the security has been oversold and is therefore poised for a rebound. However, the reliability of this signal will depend in part on the overall context. If the security is caught in a significant downtrend, then it might continue trading at an oversold level for quite some time. Traders in that situation might delay buying until they see other confirmatory signals.
IF PREVIOUS RSI > 30 AND CURRENT RSI < 30 ==> BUY SIGNAL
IF PREVIOUS RSI < 70 AND CURRENT RSI > 70 ==> SELL SIGNAL
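The two crossing rules above can be sketched as a small vectorized helper. Note that `rsi_cross_signals` and the threshold defaults are hypothetical names for illustration, not part of the original code:

```python
import pandas as pd

def rsi_cross_signals(rsi, oversold=30, overbought=70):
    """Flag RSI threshold crossings per the rules above: buy when the RSI
    drops below the oversold level, sell when it rises above the
    overbought level (hypothetical helper)."""
    prev = rsi.shift(1)
    buy = (prev > oversold) & (rsi < oversold)
    sell = (prev < overbought) & (rsi > overbought)
    return pd.DataFrame({'buy_signal': buy, 'sell_signal': sell})

# toy RSI path: dips below 30 once, breaks above 70 once
rsi_demo = pd.Series([45.0, 32.0, 28.0, 40.0, 65.0, 72.0, 68.0])
signals = rsi_cross_signals(rsi_demo)
```

Because `shift(1)` makes the first previous value NaN, comparisons on the first row are False, so no spurious signal fires at the start of the series.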
Although using an SMA to compute the RSI may give the correct buy signal at some points in time, it may also give a false sell signal at others. Using an EWMA to compute the RSI is more robust.
goog_data[goog_data['rsi_20']>50].count(axis=0)['rsi_20'] /\
goog_data[goog_data['rsi_20']<=50].count(axis=0)['rsi_20']
n_days_7=7
n_days_14=14
# # close_-1_d — this is the price difference between time t and t-1
goog_data['close_-1_s'] = goog_data['Close'].shift(1)
d = goog_data['close_-1_d'] = goog_data['Close']-goog_data['close_-1_s']
goog_data['closepm'] = ( d+d.abs() )/2 # if d>0: (d+d)/2= d, if d<0, (d+(-d))/2= 0
goog_data['closenm'] = ( -d+d.abs() )/2 # if d>0: (-d+d)/= 0, if d<0, ((-d)+(-d))/2= -d (>0)
for n_days in (n_days_7, n_days_14):
p_ema = goog_data['closepm'].ewm( com = n_days - 1,
min_periods=0, # default 0
adjust=True,
).mean()
n_ema = goog_data['closenm'].ewm( com = n_days - 1,
min_periods=0,
adjust=True,
).mean()
rs_column_name = 'rs_{}'.format(n_days)
rsi_column_name = 'rsi_{}'.format(n_days)
goog_data['p_ema'] = p_ema
goog_data['n_ema'] = n_ema
goog_data[rs_column_name] = rs = p_ema / n_ema
goog_data[rsi_column_name] = 100 - 100 / (1.0 + rs)
goog_data=goog_data.drop(['close_-1_s', 'close_-1_d', 'closepm', 'closenm'], axis=1)
import matplotlib.pyplot as plt
fig = plt.figure( figsize=(15,10) )
ax1 = fig.add_subplot( 211 )
ax1.plot( goog_data.index.values, close_price, color='k', lw=2., label='ClosePrice' )
ax1.set_ylabel( 'Google price in $', fontsize=12 )
ax1.legend()
ax3 = fig.add_subplot( 212 )
ax3.plot( goog_data.index.values, goog_data['rsi_7'], color='b', lw=2., label='rsi_7' )
ax3.plot( goog_data.index.values, goog_data['rsi_14'], color='g', lw=2., label='rsi_14' )
ax3.axhline( y=30, lw=2, color='0.7') # Line for oversold threshold
ax3.axhline( y=50, lw=2, linestyle='--', color='0.8' ) # Neutral RSI
ax3.axhline( y=70, lw=2, color='0.7') # Line for overbought threshold
ax3.set_ylabel( 'RSI', fontsize=12 )
ax3.legend()
from matplotlib.dates import DateFormatter
for ax in(ax1, ax3):
ax.xaxis.set_major_locator(ticker.MaxNLocator(12)) # 24%12=0: we need 10 xticklabels and 12 is close to 10
# or plt.autoscale(enable=True, axis='x', tight=True)
ax.autoscale(enable=True, axis='x', tight=True) # move all curves to left(touch y-axis)
ax.margins(0,0.05) # move all curves to up
ax.xaxis.set_major_formatter( DateFormatter('%Y-%m') ) # 2015-08-30 ==> 2015-08
plt.setp( ax.get_xticklabels(), rotation=30, horizontalalignment='right' )
plt.subplots_adjust( hspace=0.3 ) # space between axes
plt.show()
The RSI ranges between 0 and 100.
In a long-only stock market (such as China's domestic A-share market), RSI values are generally distributed between 20 and 80:
80-100: extremely strong - sell
50-80: strong - buy
20-50: weak - wait and see
0-20: extremely weak - buy
In two-way markets (domestic futures, London gold, forex, and so on), RSI values are generally distributed between 30 and 70:
70-100: overbought zone - go short
30-70: wait-and-see zone - enter with caution
0-30: oversold zone - go long
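The two-way-market thresholds can be expressed as a tiny classifier. The function name `rsi_zone` is a hypothetical helper for illustration, not from the original text:

```python
def rsi_zone(rsi_value, oversold=30, overbought=70):
    """Map an RSI reading to a zone for a two-way market, using the
    30/70 thresholds described above (hypothetical helper)."""
    if rsi_value < oversold:
        return 'oversold'     # candidate to go long
    if rsi_value > overbought:
        return 'overbought'   # candidate to go short
    return 'wait-and-see'
```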
Standard deviation, which will be referred to as STDEV, is a basic measure of price volatility that is used in combination with a lot of other technical analysis indicators to improve them. We'll explore that in greater detail in this section.
Standard deviation is computed by measuring the squared deviation of individual prices from the mean price, and then finding the average of all those squared deviation values. This value is known as variance, and the standard deviation is obtained by taking the square root of the variance. Larger STDEVs signal more volatile markets and an expectation of larger price moves, so trading algorithms need to factor that increased volatility into their risk estimates and trading behavior.
To compute the standard deviation over the last n periods, first we compute the variance:

Variance = ( Σ_{i=t-n+1}^{t} (Price_i - SMA)^2 ) / n

Then, the standard deviation is simply the square root of the variance:

STDEV = sqrt( Variance )
SMA : Simple moving average over n time periods.
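As a quick numeric check of the two formulas, assuming a toy five-price window (not the GOOG data):

```python
import math
import statistics as stats

prices = [10.0, 12.0, 11.0, 13.0, 14.0]   # toy price window, n = 5
sma = stats.mean(prices)                  # simple moving average over the window
# population variance: average squared deviation from the SMA
variance = sum((p - sma) ** 2 for p in prices) / len(prices)
stddev = math.sqrt(variance)              # standard deviation
```

Here the SMA is 12.0, the squared deviations sum to 10, so the variance is 2.0 and the standard deviation is sqrt(2).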
Let's have a look at the following code, which demonstrates the implementation of the standard deviation.
We import the statistics and math libraries we need to perform basic mathematical operations. We define the lookback period with the variable time_period, and we store past prices in the list history, while the SMA and the standard deviation are stored in sma_values and stddev_values. In the code, we calculate the variance and then the standard deviation. To finish, we append the results to the goog_data DataFrame that we will use to display the chart:
import yfinance as yf
import pandas as pd
start_date = '2014-01-01'
end_date = '2018-01-01'
SRC_DATA_FILENAME = 'goog_data2.pkl'
try:
goog_data2 = pd.read_pickle( SRC_DATA_FILENAME )
print( 'File found...reading GOOG data')
except:
print( 'File not found...downloading GOOG data')
goog_data2 = yf.download( 'goog', start=start_date, end=end_date)
goog_data2.to_pickle( SRC_DATA_FILENAME )
goog_data=goog_data2.tail(620)
import statistics as stats
import math as math
import matplotlib.ticker as ticker
from matplotlib.dates import DateFormatter
close = goog_data['Close']
time_period = 20 # look back period
history = [] # history of prices
sma_values = [] # to track moving average values for visualization purposes
stddev_values = [] # history of computed stddev values
for close_price in close:
history.append( close_price )
if len(history) >time_period: # we track at most ' time_period' number of prices
del (history[0])
sma = stats.mean(history)
sma_values.append( sma )
variance = 0 # variance is square of standard deviation
for hist_price in history:
variance += ( (hist_price-sma)**2 )
stddev = math.sqrt( variance/len(history) )
stddev_values.append( stddev )
goog_data = goog_data.assign( ClosePrice = pd.Series( close,
index=goog_data.index
)
)
goog_data = goog_data.assign( StandardDeviationOver_20Days = pd.Series( stddev_values,
index=goog_data.index
)
)
close_price = goog_data['ClosePrice']
stddev = goog_data['StandardDeviationOver_20Days']
import matplotlib.pyplot as plt
fig = plt.figure( figsize=(10,6) )
ax1 = fig.add_subplot( 211 )
ax1.plot( goog_data.index.values, close_price, color='g', lw=2., label='ClosePrice' )
ax1.set_ylabel('Google price in $', fontsize=12)
ax1.legend()
ax2 = fig.add_subplot( 212 )
ax2.plot( goog_data.index.values, stddev, color='b', lw=2., label='StandardDeviationOver_20Days' )
ax2.axhline( y=stddev.mean(), color='k', ls='--' )
ax2.set_xlabel('Date')
ax2.set_ylabel('Stddev in $')
ax2.legend()
for ax in (ax1, ax2):
ax.xaxis.set_major_locator(ticker.MaxNLocator(12)) # 24%12=0: we need 10 xticklabels and 12 is close to 10
# or plt.autoscale(enable=True, axis='x', tight=True)
ax.autoscale(enable=True, axis='x', tight=True) # move all curves to left(touch y-axis)
ax.margins(0,0.05) # move all curves to up
ax.xaxis.set_major_formatter( DateFormatter('%Y-%m') ) # 2015-08-30 ==> 2015-08
plt.setp( ax.get_xticklabels(), rotation=30, horizontalalignment='right' )
plt.subplots_adjust( hspace=0.3 )
plt.show()
From the output, the volatility measure (the standard deviation, STDEV, over 20 days) ranges roughly between $8 and $40, with about $15 being the average.
Here, the standard deviation quantifies the volatility in the price moves during the last 20 days. Volatility spikes when the Google stock prices spike up飙升 or spike down下跌 or go through large changes over the last 20 days. We will revisit the standard deviation as an important volatility measure in later chapters.
time_period = 20 # look back period
goog_data['std_20']= ( goog_data['Close'] ).rolling( window=time_period,
min_periods=1,
).std()
goog_data.head(25)
Because rolling().std() uses the sample standard deviation (ddof=1) by default, the first value of std_20 is NaN (the standard deviation of a single observation is undefined), and the remaining values deviate slightly from StandardDeviationOver_20Days, which was computed with the population formula (dividing by the window length).
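A minimal demonstration on toy data (not the GOOG series) of why the first rolling standard deviation is NaN, and how ddof=0 reproduces the population formula used in the manual loop:

```python
import pandas as pd

s = pd.Series([1.0, 2.0, 4.0, 7.0])
# ddof=1 (pandas default): sample std, undefined for a single observation
sample_std = s.rolling(window=3, min_periods=1).std()
# ddof=0: population std, defined everywhere (std of one value is 0)
population_std = s.rolling(window=3, min_periods=1).std(ddof=0)
```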
import matplotlib.pyplot as plt
fig = plt.figure( figsize=(10,6) )
ax1 = fig.add_subplot( 211 )
ax1.plot( goog_data.index.values, close_price, color='g', lw=2., label='ClosePrice' )
ax1.set_ylabel('Google price in $', fontsize=12)
ax1.legend()
ax2 = fig.add_subplot( 212 ) ###
ax2.plot( goog_data.index.values, goog_data['std_20'], color='b', lw=2., label='std_20days_volatility' )
ax2.set_xlabel('Date')
ax2.set_ylabel('Stddev in $')
ax2.legend()
for ax in (ax1, ax2):
ax.xaxis.set_major_locator(ticker.MaxNLocator(12)) # 24%12=0: we need 10 xticklabels and 12 is close to 10
# or plt.autoscale(enable=True, axis='x', tight=True)
ax.autoscale(enable=True, axis='x', tight=True) # move all curves to left(touch y-axis)
ax.margins(0,0.05) # move all curves to up
ax.xaxis.set_major_formatter( DateFormatter('%Y-%m') ) # 2015-08-30 ==> 2015-08
plt.setp( ax.get_xticklabels(), rotation=30, horizontalalignment='right' )
plt.subplots_adjust( hspace=0.3 )
plt.show()
Momentum, also referred to as MOM, is an important measure of speed and magnitude of price moves. This is often a key indicator of trend/breakout-based trading algorithms.
In its simplest form, momentum is simply the difference between the current price and price of some fixed time periods in the past. Consecutive periods of positive momentum values indicate an uptrend; conversely, if momentum is consecutively negative, that indicates a downtrend. Often, we use simple/exponential moving averages of the MOM indicator, as shown here, to detect sustained trends:
MOM_t = Price_t - Price_(t-n)

Here, the following applies:
Price_t: price at time t
Price_(t-n): price n time periods before time t
import yfinance as yf
import pandas as pd
start_date = '2014-01-01'
end_date = '2018-01-01'
SRC_DATA_FILENAME = 'goog_data2.pkl'
try:
goog_data2 = pd.read_pickle( SRC_DATA_FILENAME )
print( 'File found...reading GOOG data')
except:
print( 'File not found...downloading GOOG data')
goog_data2 = yf.download( 'goog', start=start_date, end=end_date)
goog_data2.to_pickle( SRC_DATA_FILENAME )
goog_data=goog_data2.tail(620)
Now, let's have a look at the code that demonstrates the implementation of momentum:
time_period = 20 # how far to look back to find reference price to compute momentum
history = [] # history of observed prices to use in momentum calculation
mom_values = [] # track momentum values for visualization purposes
for close_price in close:
history.append( close_price )
if len(history) > time_period: # history is at most 'time_period' number of observations
del (history[0])
mom = close_price - history[0]
mom_values.append( mom )
This maintains a list history of past prices and, at each new observation, computes the momentum to be the difference between the current price and the price time_period days ago, which, in this case, is 20 days:
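The same definition can be sketched in vectorized form with shift. Note that the loop above keeps history[0] as the oldest retained price, so its warm-up values (and, after the deletion step, the effective lag) differ slightly from a strict n-period shift; the toy series below is illustrative, not the GOOG data:

```python
import pandas as pd

close_demo = pd.Series([100.0, 101.0, 103.0, 102.0, 106.0, 110.0])
time_period = 3
# MOM_t = P_t - P_(t-n); the first n values are NaN because
# no reference price exists that far back yet
mom_demo = close_demo - close_demo.shift(time_period)
```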
goog_data = goog_data.assign( ClosePrice=pd.Series( close,
index=goog_data.index
)
)
goog_data = goog_data.assign( MomentumFromPrice_20DaysAgo=pd.Series( mom_values,
index = goog_data.index
)
)
close_price = goog_data['ClosePrice']
mom = goog_data['MomentumFromPrice_20DaysAgo']
import matplotlib.pyplot as plt
fig = plt.figure( figsize=(12,6))
ax1 = fig.add_subplot( 211 )
ax1.set_ylabel('Google price in $')
ax1.plot( goog_data.index.values, close_price, color='g', lw=2., label='ClosePrice' )
ax1.legend()
ax2 = fig.add_subplot( 212 )
ax2.set_ylabel('Momentum in $')
ax2.plot( goog_data.index.values, mom, color='b', lw=2., label='MomentumFromPrice_20DaysAgo')
ax2.legend()
for ax in (ax1, ax2):
ax.xaxis.set_major_locator(ticker.MaxNLocator(12)) # 24%12=0: we need 10 xticklabels and 12 is close to 10
# or plt.autoscale(enable=True, axis='x', tight=True)
ax.autoscale(enable=True, axis='x', tight=True) # move all curves to left(touch y-axis)
ax.margins(0,0.05) # move all curves to up
ax.xaxis.set_major_formatter( DateFormatter('%Y-%m') ) # 2015-08-30 ==> 2015-08
plt.setp( ax.get_xticklabels(), rotation=30, horizontalalignment='right' )
plt.subplots_adjust( hspace=0.3 )
plt.show()
The plot for momentum shows us the following:
In this section, we learned how to create trading signals based on technical analysis. In the next section, we will learn how to implement advanced concepts, such as seasonality, in trading instruments.
In trading, the prices we receive form a collection of data points at constant time intervals called a time series. Time series are time dependent and can have increasing or decreasing trends as well as seasonality, in other words, variations specific to a particular time frame. Like other retail products, financial products follow trends and seasonality during different seasons. There are multiple seasonality effects: weekend, monthly, and holiday effects.
In this section, we will use the GOOG data from 2001 to 2018 to study price variations
based on the months.
import yfinance as yf
import pandas as pd
import matplotlib.pyplot as plt
start_date = '2001-01-01'
end_date = '2018-01-01'
SRC_DATA_FILENAME = 'goog_data_large.pkl'
try:
goog_data = pd.read_pickle( SRC_DATA_FILENAME )
print( 'File found...reading GOOG data')
except:
print( 'File not found...downloading GOOG data')
goog_data = yf.download( 'goog', start=start_date, end=end_date)
goog_data.to_pickle( SRC_DATA_FILENAME )
goog_monthly_return = goog_data['Adj Close'].pct_change().groupby([
goog_data['Adj Close'].index.year,
goog_data['Adj Close'].index.month,
]).mean()
goog_monthly_return
goog_monthly_return_list = []
for ym_idx in range( len(goog_monthly_return) ):
# goog_monthly_return.index[ym_idx]: (2004, 8) or (2004, 9) or ....
goog_monthly_return_list.append( {'month':goog_monthly_return.index[ym_idx][1],
'monthly_return':goog_monthly_return[goog_monthly_return.index[ym_idx]]
}
)
goog_monthly_return_list = pd.DataFrame( goog_monthly_return_list,
columns=('month','monthly_return')
)
goog_monthly_return_list
goog_monthly_return_list.boxplot( column=['monthly_return'],
by='month', # Column in the DataFrame to pandas.DataFrame.groupby()
figsize=(10,5),
fontsize=12,
)
ax = plt.gca()
labels = [ item.get_text()
for item in ax.get_xticklabels()
]
labels=['Jan', 'Feb', 'Mar', 'Apr', 'May', 'Jun','Jul', 'Aug', 'Sep', 'Oct', 'Nov', 'Dec']
ax.set_xticklabels( labels )
ax.set_ylabel('GOOG return')
ax.set_title('GOOG Monthly return 2001-2018')
plt.suptitle("")
plt.show()
The preceding code will return the following output. The following screenshot represents the GOOG monthly return:
In this screenshot, we observe repetitive patterns (for example, in September, October, and December the first quartile of the monthly returns is above zero, meaning at least 75% of the observed returns in those months were positive, and October has the highest median return). October is the month when the return seems to be the highest (see the median value in the box), unlike November, where we observe a drop in the return.
#################
goog_y_m_return_list = []
for ym_idx in range( len(goog_monthly_return) ):
# goog_monthly_return.index[ym_idx]: (2004, 8) or (2004, 9) or ....
goog_y_m_return_list.append( { 'year':goog_monthly_return.index[ym_idx][0],
'month':goog_monthly_return.index[ym_idx][1],
'monthly_return':goog_monthly_return[goog_monthly_return.index[ym_idx]]
}
)
goog_y_m_return_list = pd.DataFrame( goog_y_m_return_list,
columns=('year','month','monthly_return')
)
goog_y_m_return_list[:17]
plt.figure( figsize=(10,10) )
import seaborn as sns
sns.barplot( x='month', y='monthly_return',hue='year',
linewidth=1, edgecolor='w',
data=goog_y_m_return_list[5:]
)
plt.show()
It can be seen that from 2005 to 2017, the average monthly return was consistently positive in some months and consistently negative in others.
#################
Since it is a time series, we will study its stationarity (whether the mean and variance remain constant over time). In the following code, we will check this property, because the time series models that follow work on the assumption that the time series is stationary:
Constant mean
Constant variance
Time-independent autocovariance
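Before inspecting the GOOG data, the contrast can be illustrated with synthetic series (a sketch with assumed toy data: white noise is stationary, while its cumulative sum, a random walk resembling a price path, is not):

```python
import numpy as np

rng = np.random.default_rng(42)
noise = rng.normal(0.0, 1.0, 1000)   # white noise: constant mean and variance (stationary)
random_walk = np.cumsum(noise)       # running sum of noise: variance grows with time (non-stationary)
```

Plotting the rolling mean and rolling standard deviation of each series, as done below for GOOG, would show flat lines for the noise and drifting lines for the random walk.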
# Displaying rolling statistics
def plot_rolling_statistics_ts( ts, titletext, ytext, window_size=12 ):
ts.plot( color='red', label='Original', lw=0.5 )
ts.rolling( window_size ).mean().plot( color='blue', label='Rolling Mean' )
ts.rolling( window_size ).std().plot( color='black', label='Rolling Std' )
plt.legend( loc='best' )
plt.ylabel( ytext )
plt.xlabel( 'Date')
plt.setp( plt.gca().get_xticklabels(), rotation=30, horizontalalignment='right' )
plt.title( titletext )
plt.show( block=False )
plot_rolling_statistics_ts( goog_monthly_return[1:],
'GOOG prices rolling mean and standard deviation',
'Monthly return'
)
plot_rolling_statistics_ts( goog_data['Adj Close'],
' GOOG prices rolling mean and standard deviation',
'Daily prices',
365
)
The preceding code will return the following two charts, where we will compare the difference using two different time series.
* One shows the GOOG daily prices, and the other one shows the GOOG monthly return.
* We observe that the rolling average and rolling standard deviation are not constant when using the daily prices, whereas they are when using the daily returns. (The daily return measures the change in a stock's price as a percentage of the previous day's closing price. A positive return means the stock has grown in value, while a negative return means it has lost value. A stock with smaller positive and negative daily returns is typically less risky than one with larger daily returns, which create bigger swings in value.)
# the daily historical log returns
plot_rolling_statistics_ts( np.log(goog_data['Adj Close']/goog_data['Adj Close'].shift(1) ),
' GOOG prices rolling mean and standard deviation',
'Daily prices',
)
# the daily historical returns
plot_rolling_statistics_ts( goog_data['Adj Close']/goog_data['Adj Close'].shift(1),
' GOOG prices rolling mean and standard deviation',
'Daily prices',
)
* This means that the first time series, representing the daily prices, is not stationary. Therefore, we will need to make this time series stationary.
* The non-stationarity of a time series can generally be attributed to two factors: trend and seasonality.
The following plot shows GOOG daily prices
When observing the plot of the GOOG daily prices, the following can be stated:
We can see that the price is growing over time; this is a trend.
The wave effect we are observing on the GOOG daily prices comes from seasonality(see previous boxplot).
When we make a time series stationary, we remove the trend and seasonality by modeling and removing them from the initial data.
Once we find a model predicting future values for the data without seasonality and trend, we can apply back the seasonality and trend values to get the actual forecasted data.
The following plot shows the GOOG monthly return:
For the data using the GOOG daily prices, we can just remove the trend by subtracting the moving average from the daily prices in order to obtain the following screenshot:
plot_rolling_statistics_ts( goog_data['Adj Close']-goog_data['Adj Close'].rolling(365).mean(),
'GOOG daily price without trend',
'Daily prices',
365
)
We recommend that you read a book on time series to go deeper into this kind of analysis, such as Practical Time Series Analysis: Master Time Series Data Processing, Visualization, and Modeling Using Python, from Packt.
3. To confirm our observation, in the code, we use the popular statistical test: the augmented Dickey-Fuller test:
conda install -c conda-forge statsmodels
statsmodels.tsa.stattools.adfuller(x, maxlag=None, regression='c', autolag='AIC', store=False, regresults=False)
https://www.statsmodels.org/dev/generated/statsmodels.tsa.stattools.adfuller.html
Augmented Dickey-Fuller unit root test.
The Augmented Dickey-Fuller test can be used to test for a unit root in a univariate process in the presence of serial correlation.
Parameters
autolag : {"AIC", "BIC", "t-stat", None}
Method to use when automatically determining the lag length among the values 0, 1, ..., maxlag.
If "AIC" (the default, Akaike information criterion) or "BIC" (Bayesian information criterion), then the number of lags is chosen to minimize the corresponding information criterion.
https://blog.csdn.net/Linli522362242/article/details/105973507
If "t-stat", the choice of maxlag is based on the t-statistic: it starts with maxlag and drops a lag until the t-statistic on the last lag length is significant using a 5%-sized test. https://blog.csdn.net/Linli522362242/article/details/91037961
If None, then the number of included lags is set to maxlag.
from statsmodels.tsa.stattools import adfuller
def test_stationarity( timeseries ):
print( "Results of Dickey-Fuller Test:" )
df_test = adfuller( timeseries[1:], autolag='AIC' )
print(df_test)
df_output = pd.Series( df_test[0:4], index=['Test Statistic',
'p-value',
"#Lags Used",
"Number of Observations Used"
]
)
print( df_output )
test_stationarity( goog_data['Adj Close'])
This test returns a p-value of 0.996. Therefore, the time series is not stationary.
4. Let's have a look at the test:
test_stationarity( goog_monthly_return[1:] )
When the p-value is small enough, that is, smaller than the significance level (the probability, assuming the null hypothesis is true, of the test statistic falling in the rejection region), we can reject the null hypothesis.
This test returns a p-value of less than 0.05, so we can reject the null hypothesis of a unit root and treat the monthly return series as stationary. We recommend using returns rather than raw prices when studying financial products; in this example, the returns are already stationary, so no transformation is needed.
test_stationarity( np.log(goog_data['Adj Close']/goog_data['Adj Close'].shift(1)) )
5. The last step of the time series analysis is to forecast the time series. We have two possible scenarios:
The parameter values for AR(p) and MA(q) can be found by using the autocorrelation function (ACF) and the partial autocorrelation function (PACF), respectively:
from statsmodels.graphics.tsaplots import plot_acf
from statsmodels.graphics.tsaplots import plot_pacf
import matplotlib.pyplot as plt
from matplotlib import pyplot
plt.figure()
plt.subplot(211)
plot_acf( goog_monthly_return[1:], ax=pyplot.gca(), lags=10 )
# plt.yticks([0,0.25,0.5,0.75,1])
plt.autoscale(enable=True, axis='y', tight=True)
plt.subplot(212)
plot_pacf( goog_monthly_return[1:], ax=pyplot.gca(), lags=10 )
plt.autoscale(enable=True, axis='y', tight=True)
plt.subplots_adjust( hspace=0.5 )
plt.show()
https://www.statsmodels.org/devel/generated/statsmodels.graphics.tsaplots.plot_acf.html
plot_acf plots the autocorrelation function: lags on the horizontal axis and the correlations on the vertical axis.
When we observe the two preceding diagrams, we can draw the confidence interval on either side of 0. We will use this confidence interval to determine the parameter values for the AR(p) and MA(q).
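To make the reading of these diagrams concrete, the lag-k autocorrelation that plot_acf displays can be computed by hand (a minimal manual sketch, not the statsmodels implementation; x_demo is assumed toy data, not the GOOG returns):

```python
import numpy as np

def acf(x, lag):
    """Lag-k autocorrelation: covariance of the demeaned series with its
    lagged copy, normalized by the overall variance."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    return float(np.dot(x[:-lag], x[lag:]) / np.dot(x, x))

# toy series with a rise-and-fall pattern, so adjacent values correlate
x_demo = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 4.0, 3.0, 2.0, 1.0, 2.0])
lag1 = acf(x_demo, 1)
```

When a lag's autocorrelation falls outside the plotted confidence band, that lag is considered significant when choosing p and q.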
6. These two graphs suggest using q=1 and p=1. We will apply the ARIMA model in the following code (see also Chapter 8, ARIMA models, in Forecasting: Principles and Practice, 2nd ed):
https://www.statsmodels.org/dev/generated/statsmodels.tsa.arima.model.ARIMA.html
endog array_like, optional
The observed time-series process y.
exog array_like, optional
Array of exogenous regressors.
order tuple, optional
The (p,d,q) order of the model for the autoregressive, differences, and moving average components. d is always an integer, while p and q may either be integers or lists of integers.
from statsmodels.tsa.arima.model import ARIMA
model = ARIMA( goog_monthly_return[1:], order=(2,0,2) )
fitted_results = model.fit()
goog_monthly_return[1:].plot()
fitted_results.fittedvalues.plot( color='red' )
plt.setp( plt.gca().get_xticklabels(), rotation=30, horizontalalignment='right' )
plt.show()
In this chapter, we explored concepts of generating trading signals, such as support and resistance, based on the intuitive ideas of supply and demand that are fundamental forces
that drive market prices. We also briefly explored how you might use support and resistance to implement a simple trading strategy. Then, we looked into a variety of technical analysis indicators, explained the intuition behind them, and implemented and visualized their behavior during different price movements. We also introduced and implemented the ideas behind advanced mathematical approaches, such as Autoregressive (AR), Moving Average (MA), Differentiation (D), AutoCorrelation Function (ACF), and Partial Autocorrelation Function (PACF) for dealing with non-stationary time series datasets. Finally, we briefly introduced an advanced concept such as seasonality, which explains how there are repeating patterns in financial datasets, basic time series analysis and concepts of stationary or non-stationary time series, and how you may model financial data that displays that behavior.
In the next chapter, we will review and implement some simple regression and classification methods and understand the advantages of applying supervised statistical learning methods to trading.