PYTHON
Write a Python function that returns the total length of a line that passes through any number of provided points (x, y). The points should be passed as individual tuples or lists. The function should also have a Boolean parameter indicating whether the line should start from the origin, defaulting to False. If True, the returned value should include the distance from the origin to the first point; otherwise, start adding distances from the first point. So a function call could look something like this:
dist = lineLength((1,2), (2,3), (7,4), start=False)
Demonstrate it in your main program by calculating the length of a line going through the following 5 points (with and without the origin option set to True):
(1,1), (-2,4), (-3,-2), (2,-1), (1,1)
Python3 code:
# Function to calculate the total length of the line through the given points
def lineLength(*points, start=False):
    points = list(points)
    if start:
        points = [(0, 0)] + points
    # Sum the Euclidean distance between each pair of consecutive points
    dist = 0
    for i in range(len(points) - 1):
        length = ((points[i+1][0] - points[i][0])**2 + (points[i+1][1] - points[i][1])**2)**0.5
        dist += length
    return dist

# main function
def main():
    # Testing for both cases
    dist1 = lineLength((1,1), (-2,4), (-3,-2), (2,-1), (1,1), start=False)
    dist2 = lineLength((1,1), (-2,4), (-3,-2), (2,-1), (1,1), start=True)
    # Print the output
    print("Length of the line without origin:", dist1)
    print("Length of the line with origin:", dist2)

if __name__ == "__main__":
    main()
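Sample output (values shown here rounded to four decimal places; the program itself prints full floating-point precision):

Length of the line without origin: 17.6605
Length of the line with origin: 19.0747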
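As a side note (not part of the original answer), Python 3.8+ provides math.dist, which computes the Euclidean distance between two points directly. A minimal sketch of the same function using it, with the hypothetical name lineLength2, could look like this:

import math

def lineLength2(*points, start=False):
    # Optionally prepend the origin, matching the start parameter above
    pts = ([(0, 0)] if start else []) + list(points)
    # Pair each point with its successor and sum the Euclidean distances
    return sum(math.dist(p, q) for p, q in zip(pts, pts[1:]))

Calling lineLength2((1,1), (-2,4), (-3,-2), (2,-1), (1,1)) returns the same value as the loop version above.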