yoavzimmerman
My dynamic programming solution (in Python, because I dislike Java):
First, define a Fibonacci function with memoization so we can get the Nth Fibonacci number in O(N).
fib = dict()

def fibonacci(i):
    # memoized Fibonacci, with the convention fib(0) = fib(1) = 1
    if not fib.get(i):
        if i < 2:
            fib[i] = 1
        else:
            fib[i] = fibonacci(i - 1) + fibonacci(i - 2)
    return fib[i]
Now we can build up the solution from 1 to N using DP. In general, for an arbitrary array E of integers, the DP step looks like:

DP(i) = min(1 + DP(i - e_j, j - 1)) for 0 <= j < len(E), where e_j is the j-th element of E.

In this algorithm, E is just the sequence of Fibonacci numbers. Since the Fibonacci sequence is infinite, we only really need the numbers up to N, or up to i at each step of the iteration. The repetition is the tricky part, since we have to test all possible multiples of each e_j.
def min_fib(N):
    fib_index = 0
    DP = dict()
    E = set()
    DP[0] = 0
    for i in range(1, N + 1):
        # update the set E with all fib numbers that could add up to i
        while fibonacci(fib_index) <= i:
            E.add(fibonacci(fib_index))
            fib_index += 1
        min_option = None
        for e in E:
            multiple = e
            # since repetitions are allowed, we have to test all multiples
            while i - multiple >= 0:
                option = multiple // e + DP[i - multiple]
                if min_option is None or option < min_option:
                    min_option = option
                # extra optimization: if min_option is already at most the
                # number of copies of e used, larger multiples of e cannot
                # achieve a better score, so we can break here
                if min_option <= multiple // e:
                    break
                multiple += e
        DP[i] = min_option
    return DP[N]
Runtime is O(N^3), I believe. Without repetitions allowed, it would be O(N^2).
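For what it's worth, since repetitions are allowed, the multiples loop can be folded into the DP itself: taking one copy of e and falling back to DP[i - e] already accounts for every multiple of e. A compact restatement along those lines (same convention as above, fib(0) = fib(1) = 1), handy as a sanity check:

```python
# A compact restatement of the DP above, usable as a sanity check.
# Because repetitions are allowed, trying one copy of each Fibonacci
# number f and falling back to DP[i - f] already covers every multiple,
# so the inner multiples loop is not strictly needed.
from functools import lru_cache

@lru_cache(maxsize=None)
def fibonacci(i):
    # same convention as above: fib(0) = fib(1) = 1
    return 1 if i < 2 else fibonacci(i - 1) + fibonacci(i - 2)

def min_fib(N):
    fibs = []
    k = 0
    while fibonacci(k) <= N:
        fibs.append(fibonacci(k))
        k += 1
    fibs = sorted(set(fibs))  # drop the duplicate leading 1
    DP = [0] * (N + 1)
    for i in range(1, N + 1):
        DP[i] = min(1 + DP[i - f] for f in fibs if f <= i)
    return DP[N]

print(min_fib(10))  # 10 = 8 + 2, so 2
print(min_fib(19))  # 19 = 13 + 5 + 1, so 3
```

This version runs in O(N log N), since there are only O(log N) Fibonacci numbers up to N.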
yoavzimmerman November 03, 2014
In O(N) with Python:
def find_num(L):
    # count occurrences of each element
    counts = {}
    for l in L:
        if counts.get(l):
            counts[l] += 1
        else:
            counts[l] = 1
    # return an element that appears exactly once
    for el in counts:
        if counts[el] == 1:
            return el
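The same two-pass idea can lean on the standard library: collections.Counter builds the frequency table in one line, and scanning L (rather than the dict) makes "first element occurring exactly once" well-defined regardless of dict ordering. A sketch, assuming that first-unique interpretation of the problem:

```python
from collections import Counter

def find_num(L):
    counts = Counter(L)  # O(N) frequency table
    for el in L:         # scan in input order
        if counts[el] == 1:
            return el    # first element that occurs exactly once

print(find_num([4, 4, 7, 4, 5, 5]))  # 7
```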

yoavzimmerman
November 03, 2014
Good catch, I completely missed that problem.
I think you're very close, but I'm still not convinced that there is always a pair of lines you can remove to make two subsets that satisfy condition #1. What about an extremely dense set of line segments, such as the following? (Excuse the crude drawing, but I think it gets the point across.)
i.imgur.com/AbhWaQd.png
We could try picking a random ray from the origin, then running interval scheduling on L − l for every line segment l that intersects it, then taking the max. I think that brings the time complexity to O(n^2 log n), though.
Thoughts?
EDIT: this is actually wrong, see anantpushkar009's comments for discussion
A key insight is that this problem is the same as unweighted interval scheduling, just around a circle. Consider two line segments l1 and l2 represented by the following points:
l1: (p1, p2)
l2: (p3, p4)
The only way these line segments intersect is if p2 is "further along" the unit circle than p3, or p4 is "further along" the unit circle than p1. A nice way to represent position on the unit circle is radians, which we can get by taking the inverse tangent (atan2) of each point. One subtlety is that we need to normalize the angle to a fixed range; I'll use 0 to 2*pi. In Python:
import math

def arctan(x, y):
    # polar angle of (x, y), normalized to the range [0, 2*pi)
    atan = math.atan2(y, x)
    while atan < 0:
        atan += 2 * math.pi
    while atan >= 2 * math.pi:
        atan -= 2 * math.pi
    return atan
def transform(L):
    # convert each segment's endpoints to angles on the unit circle
    transformed = list()
    for line in L:
        new_point = list()
        for x, y in line:
            new_point.append(arctan(x, y))
        transformed.append(new_point)
    return transformed
Now that we have a "transformed" set of line segments, each with a start and finish angle, the problem is identical to maximum unweighted interval scheduling, and the standard greedy algorithm applies: sort all line segments by finish angle, then take each next segment and skip any that conflict with the one just taken (see an algorithms textbook for a proof):
def no_intersection(L):
    T = transform(L)
    T = sorted(T, key=lambda segment: segment[1])  # sort by finish angle
    result = list()
    i = 0
    while i < len(T):
        result.append(T[i])
        # find the next non-conflicting interval
        j = i
        while j < len(T) and T[j][0] < T[i][1]:
            j += 1
        i = j
    return result
If you wanted the original line segments, you could also store the mappings in a hash table in the transforming step.
Time complexity is O(n log n), dominated by the sorting step in the greedy algorithm.
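Putting the pieces together, here is a self-contained sketch of the whole pipeline: angles via atan2, sort by finish angle, greedy skip of conflicts. The example chords and the `chord` helper are made up for illustration, and like the code above this ignores the wrap-around issue flagged in the EDIT:

```python
import math

def angle(x, y):
    # polar angle of (x, y) normalized to [0, 2*pi); note atan2 takes (y, x)
    a = math.atan2(y, x)
    return a + 2 * math.pi if a < 0 else a

def no_intersection(L):
    # each segment is a pair of (x, y) endpoints on the unit circle
    T = sorted((sorted(angle(x, y) for x, y in seg) for seg in L),
               key=lambda s: s[1])          # sort by finish angle
    result, last_finish = [], -1.0
    for start, finish in T:
        if start >= last_finish:            # no conflict with last kept segment
            result.append((start, finish))
            last_finish = finish
    return result

def chord(a, b):
    # hypothetical helper: chord with endpoints at angles a and b (radians)
    return [(math.cos(a), math.sin(a)), (math.cos(b), math.sin(b))]

# middle chord interleaves with the first, so only two survive
segs = [chord(0.1, 0.5), chord(0.4, 0.9), chord(1.0, 1.5)]
print(len(no_intersection(segs)))  # 2
```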
yoavzimmerman November 18, 2014
Recursive solution in Python: