Lagrange interpolation polynomial of degree 3 in Python
I understand the math for this question, but I cannot seem to get the correct Python code. Please help me solve it in Python.
For f(x) = x ln(x), (1) use an appropriate Lagrange interpolating polynomial of degree three to approximate f(8.4), using the following data: f(8.1) = 16.94410, f(8.3) = 17.56492, f(8.6) = 18.50515, f(8.7) = 18.82091. (2) Then redo the work using an appropriate Newton interpolation polynomial.
Everything has to be done in Python.
1. First attempt:
# Python3 program for implementation
# of Lagrange's Interpolation

# To represent a data point corresponding to x and y = f(x)
class Data:
    def __init__(self, x, y):
        self.x = x
        self.y = y

# Function to interpolate the given data points
# using Lagrange's formula
# xi -> the new data point whose value is to be obtained
# n  -> the number of known data points
def interpolate(f: list, xi: float, n: int) -> float:
    # Initialize result
    result = 0.0
    for i in range(n):
        # Compute the individual terms of the formula
        term = f[i].y
        for j in range(n):
            if j != i:
                term = term * (xi - f[j].x) / (f[i].x - f[j].x)
        # Add the current term to the result
        result += term
    return result

# Driver code
if __name__ == "__main__":
    # Creating an array of 4 known data points
    f = [Data(0, 2), Data(1, 3), Data(2, 12), Data(5, 147)]
    # Using the interpolate function to obtain a data point
    # corresponding to x = 3
    print("Value of f(3) is:", interpolate(f, 3, 4))
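The routine above only demonstrates the method on sample data, not on the points from the question. A minimal self-contained sketch applying the same Lagrange formula to the four data points from the problem statement (the helper name `lagrange` is my own):

```python
import math

# Data points from the problem: x and f(x) = x ln(x)
xs = [8.1, 8.3, 8.6, 8.7]
ys = [16.94410, 17.56492, 18.50515, 18.82091]

def lagrange(xs, ys, xi):
    """Evaluate the Lagrange interpolating polynomial at xi."""
    result = 0.0
    for i in range(len(xs)):
        term = ys[i]
        for j in range(len(xs)):
            if j != i:
                term *= (xi - xs[j]) / (xs[i] - xs[j])
        result += term
    return result

approx = lagrange(xs, ys, 8.4)
exact = 8.4 * math.log(8.4)   # true value, approx 17.877146
print('P3(8.4) =', approx)
print('f(8.4)  =', exact)
```

Because the four nodes bracket 8.4 closely, the degree-three approximation should agree with the true value 8.4 ln(8.4) to several decimal places.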
2. Second attempt:
# Lagrange Interpolation

# Importing NumPy library
import numpy as np

# Reading number of data points
n = int(input('Enter number of data points: '))

# Making NumPy arrays of size n, initialized to zero,
# for storing the x and y values
x = np.zeros(n)
y = np.zeros(n)

# Reading data points
print('Enter data for x and y: ')
for i in range(n):
    x[i] = float(input('x[' + str(i) + '] = '))
    y[i] = float(input('y[' + str(i) + '] = '))

# Reading interpolation point
xp = float(input('Enter interpolation point: '))

# Set interpolated value initially to zero
yp = 0

# Implementing Lagrange interpolation
for i in range(n):
    p = 1
    for j in range(n):
        if i != j:
            p = p * (xp - x[j]) / (x[i] - x[j])
    yp = yp + p * y[i]

# Displaying output
print('Interpolated value at %.3f is %.3f.' % (xp, yp))
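For part (2) of the question, Newton's divided-difference form builds the same degree-three polynomial from a coefficient table and evaluates it with nested (Horner-like) multiplication. A minimal sketch using the problem's data (the helper names `newton_coeffs` and `newton_eval` are my own):

```python
import math

# Data points from the problem: x and f(x) = x ln(x)
xs = [8.1, 8.3, 8.6, 8.7]
ys = [16.94410, 17.56492, 18.50515, 18.82091]

def newton_coeffs(xs, ys):
    """Return divided-difference coefficients f[x0], f[x0,x1], ..."""
    n = len(xs)
    coef = list(ys)
    # Overwrite the table in place, column by column
    for j in range(1, n):
        for i in range(n - 1, j - 1, -1):
            coef[i] = (coef[i] - coef[i - 1]) / (xs[i] - xs[i - j])
    return coef

def newton_eval(xs, coef, xi):
    """Evaluate Newton's form with nested multiplication."""
    result = coef[-1]
    for i in range(len(coef) - 2, -1, -1):
        result = result * (xi - xs[i]) + coef[i]
    return result

coef = newton_coeffs(xs, ys)
print('N3(8.4) =', newton_eval(xs, coef, 8.4))
```

Since Lagrange and Newton forms represent the same unique interpolating polynomial through the four points, the two approximations of f(8.4) must agree (up to floating-point rounding).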