Question

Using the Box-Jenkins four-step method, forecast US quarterly GDP for the second quarter of 2015.

observation_date GDP (billion $)
1947-01-01 1934.5
1947-04-01 1932.3
1947-07-01 1930.3
1947-10-01 1960.7
1948-01-01 1989.5
1948-04-01 2021.9
1948-07-01 2033.2
1948-10-01 2035.3
1949-01-01 2007.5
1949-04-01 2000.8
1949-07-01 2022.8
1949-10-01 2004.7
1950-01-01 2084.6
1950-04-01 2147.6
1950-07-01 2230.4
1950-10-01 2273.4
1951-01-01 2304.5
1951-04-01 2344.5
1951-07-01 2392.8
1951-10-01 2398.1
1952-01-01 2423.5
1952-04-01 2428.5
1952-07-01 2446.1
1952-10-01 2526.4
1953-01-01 2573.4
1953-04-01 2593.5
1953-07-01 2578.9
1953-10-01 2539.8
1954-01-01 2528.0
1954-04-01 2530.7
1954-07-01 2559.4
1954-10-01 2609.3
1955-01-01 2683.8
1955-04-01 2727.5
1955-07-01 2764.1
1955-10-01 2780.8
1956-01-01 2770.0
1956-04-01 2792.9
1956-07-01 2790.6
1956-10-01 2836.2
1957-01-01 2854.5
1957-04-01 2848.2
1957-07-01 2875.9
1957-10-01 2846.4
1958-01-01 2772.7
1958-04-01 2790.9
1958-07-01 2855.5
1958-10-01 2922.3
1959-01-01 2976.6
1959-04-01 3049.0
1959-07-01 3043.1
1959-10-01 3055.1
1960-01-01 3123.2
1960-04-01 3111.3
1960-07-01 3119.1
1960-10-01 3081.3
1961-01-01 3102.3
1961-04-01 3159.9
1961-07-01 3212.6
1961-10-01 3277.7
1962-01-01 3336.8
1962-04-01 3372.7
1962-07-01 3404.8
1962-10-01 3418.0
1963-01-01 3456.1
1963-04-01 3501.1
1963-07-01 3569.5
1963-10-01 3595.0
1964-01-01 3672.7
1964-04-01 3716.4
1964-07-01 3766.9
1964-10-01 3780.2
1965-01-01 3873.5
1965-04-01 3926.4
1965-07-01 4006.2
1965-10-01 4100.6
1966-01-01 4201.9
1966-04-01 4219.1
1966-07-01 4249.2
1966-10-01 4285.6
1967-01-01 4324.9
1967-04-01 4328.7
1967-07-01 4366.1
1967-10-01 4401.2
1968-01-01 4490.6
1968-04-01 4566.4
1968-07-01 4599.3
1968-10-01 4619.8
1969-01-01 4691.6
1969-04-01 4706.7
1969-07-01 4736.1
1969-10-01 4715.5
1970-01-01 4707.1
1970-04-01 4715.4
1970-07-01 4757.2
1970-10-01 4708.3
1971-01-01 4834.3
1971-04-01 4861.9
1971-07-01 4900.0
1971-10-01 4914.3
1972-01-01 5002.4
1972-04-01 5118.3
1972-07-01 5165.4
1972-10-01 5251.2
1973-01-01 5380.5
1973-04-01 5441.5
1973-07-01 5411.9
1973-10-01 5462.4
1974-01-01 5417.0
1974-04-01 5431.3
1974-07-01 5378.7
1974-10-01 5357.2
1975-01-01 5292.4
1975-04-01 5333.2
1975-07-01 5421.4
1975-10-01 5494.4
1976-01-01 5618.5
1976-04-01 5661.0
1976-07-01 5689.8
1976-10-01 5732.5
1977-01-01 5799.2
1977-04-01 5913.0
1977-07-01 6017.6
1977-10-01 6018.2
1978-01-01 6039.2
1978-04-01 6274.0
1978-07-01 6335.3
1978-10-01 6420.3
1979-01-01 6433.0
1979-04-01 6440.8
1979-07-01 6487.1
1979-10-01 6503.9
1980-01-01 6524.9
1980-04-01 6392.6
1980-07-01 6382.9
1980-10-01 6501.2
1981-01-01 6635.7
1981-04-01 6587.3
1981-07-01 6662.9
1981-10-01 6585.1
1982-01-01 6475.0
1982-04-01 6510.2
1982-07-01 6486.8
1982-10-01 6493.1
1983-01-01 6578.2
1983-04-01 6728.3
1983-07-01 6860.0
1983-10-01 7001.5
1984-01-01 7140.6
1984-04-01 7266.0
1984-07-01 7337.5
1984-10-01 7396.0
1985-01-01 7469.5
1985-04-01 7537.9
1985-07-01 7655.2
1985-10-01 7712.6
1986-01-01 7784.1
1986-04-01 7819.8
1986-07-01 7898.6
1986-10-01 7939.5
1987-01-01 7995.0
1987-04-01 8084.7
1987-07-01 8158.0
1987-10-01 8292.7
1988-01-01 8339.3
1988-04-01 8449.5
1988-07-01 8498.3
1988-10-01 8610.9
1989-01-01 8697.7
1989-04-01 8766.1
1989-07-01 8831.5
1989-10-01 8850.2
1990-01-01 8947.1
1990-04-01 8981.7
1990-07-01 8983.9
1990-10-01 8907.4
1991-01-01 8865.6
1991-04-01 8934.4
1991-07-01 8977.3
1991-10-01 9016.4
1992-01-01 9123.0
1992-04-01 9223.5
1992-07-01 9313.2
1992-10-01 9406.5
1993-01-01 9424.1
1993-04-01 9480.1
1993-07-01 9526.3
1993-10-01 9653.5
1994-01-01 9748.2
1994-04-01 9881.4
1994-07-01 9939.7
1994-10-01 10052.5
1995-01-01 10086.9
1995-04-01 10122.1
1995-07-01 10208.8
1995-10-01 10281.2
1996-01-01 10348.7
1996-04-01 10529.4
1996-07-01 10626.8
1996-10-01 10739.1
1997-01-01 10820.9
1997-04-01 10984.2
1997-07-01 11124.0
1997-10-01 11210.3
1998-01-01 11321.2
1998-04-01 11431.0
1998-07-01 11580.6
1998-10-01 11770.7
1999-01-01 11864.7
1999-04-01 11962.5
1999-07-01 12113.1
1999-10-01 12323.3
2000-01-01 12359.1
2000-04-01 12592.5
2000-07-01 12607.7
2000-10-01 12679.3
2001-01-01 12643.3
2001-04-01 12710.3
2001-07-01 12670.1
2001-10-01 12705.3
2002-01-01 12822.3
2002-04-01 12893.0
2002-07-01 12955.8
2002-10-01 12964.0
2003-01-01 13031.2
2003-04-01 13152.1
2003-07-01 13372.4
2003-10-01 13528.7
2004-01-01 13606.5
2004-04-01 13706.2
2004-07-01 13830.8
2004-10-01 13950.4
2005-01-01 14099.1
2005-04-01 14172.7
2005-07-01 14291.8
2005-10-01 14373.4
2006-01-01 14546.1
2006-04-01 14589.6
2006-07-01 14602.6
2006-10-01 14716.9
2007-01-01 14726.0
2007-04-01 14838.7
2007-07-01 14938.5
2007-10-01 14991.8
2008-01-01 14889.5
2008-04-01 14963.4
2008-07-01 14891.6
2008-10-01 14577.0
2009-01-01 14375.0
2009-04-01 14355.6
2009-07-01 14402.5
2009-10-01 14541.9
2010-01-01 14604.8
2010-04-01 14745.9
2010-07-01 14845.5
2010-10-01 14939.0
2011-01-01 14881.3
2011-04-01 14989.6
2011-07-01 15021.1
2011-10-01 15190.3
2012-01-01 15275.0
2012-04-01 15336.7
2012-07-01 15431.3
2012-10-01 15433.7
2013-01-01 15538.4
2013-04-01 15606.6
2013-07-01 15779.9
2013-10-01 15916.2
2014-01-01 15831.7
2014-04-01 16010.4
2014-07-01 16205.6
2014-10-01 16294.7
2015-01-01 15107.3

Homework Answers

Answer #1

Following the Box-Jenkins four-step method (model identification, parameter estimation, diagnostic checking, and forecasting), I wrote an R program to fit an ARIMA model and forecast GDP for the second quarter of 2015:

rm(list=ls())
data=read.csv(file.choose())
GDP=data[,2]
GDP=ts(GDP,start=c(1947,1),frequency=4)  # quarterly series starting 1947 Q1
plot(GDP)

# Step 1: identification
library(tseries)
adf.test(GDP)    # unit-root test: the level series is non-stationary
GDP1=diff(GDP)   # first difference
adf.test(GDP1)   # differenced series is stationary, so d=1
acf(GDP1)        # ACF cuts off after lag 2, suggesting q=2
pacf(GDP1)       # PACF cuts off after lag 1, suggesting p=1

# Step 2: estimation -- fit ARIMA(1,1,2)
fit <- arima(GDP, order=c(1,1,2))

# Step 3: diagnostic checking -- residuals should be white noise
Box.test(residuals(fit), lag=10, type="Ljung-Box")

# Step 4: forecasting -- one step ahead (2015 Q2)
library(forecast)
pred <- forecast(fit, h=1)
print(pred)

The final output is:

        Point Forecast     Lo 80     Hi 80     Lo 95     Hi 95
2015 Q2       14481.23  14356.27  14606.20  14290.12  14672.35
