zr.roman
Given a tree.
Each node is of type:
public class Node { public IEnumerable<Node> Children { get; set; } }
Write a function GetAllNodes which returns a collection containing all nodes of the given tree (only the root node is given), i.e.:
IEnumerable<Node> allNodes = GetAllNodes( rootNode );
(The task is given in C#, but this is not a restriction, you may use any language you want).
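For what it's worth, one way to satisfy this signature is an explicit-stack traversal (a sketch, not an answer from the thread; the task does not fix a traversal order, so any order works, and the `TreeWalker` class name is just for illustration):

```csharp
using System;
using System.Collections.Generic;

public class Node {
    public IEnumerable<Node> Children { get; set; }
}

public static class TreeWalker {
    // Returns every node reachable from root, using an explicit stack
    // instead of recursion (safe for very deep trees).
    public static IEnumerable<Node> GetAllNodes( Node root ) {
        var result = new List<Node>();
        if ( root == null ) { return result; }
        var stack = new Stack<Node>();
        stack.Push( root );
        while ( stack.Count > 0 ) {
            var node = stack.Pop();
            result.Add( node );
            if ( node.Children == null ) { continue; }
            foreach ( var child in node.Children ) {
                stack.Push( child );
            }
        }
        return result;
    }
}
```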
- zr.roman in Russia
Software Engineer / Developer Algorithm
Implement class Queue with the help of only 2 stacks,
i.e.:
class Queue<T> {
    private Stack<T> _stack1;
    private Stack<T> _stack2;
    // public methods: Enqueue, Dequeue, Peek, Count
    // private methods, if any.
}
- zr.roman in Russia
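A sketch of the classic two-stack scheme that fits the skeleton above (the class is renamed `QueueViaStacks` here only to avoid clashing with System.Collections.Generic.Queue): push into one stack, pop from the other, and refill the outgoing stack only when it runs empty, which gives amortized O(1) per operation.

```csharp
using System;
using System.Collections.Generic;

class QueueViaStacks<T> {
    private Stack<T> _stack1 = new Stack<T>(); // incoming elements
    private Stack<T> _stack2 = new Stack<T>(); // outgoing elements, reversed

    public int Count { get { return _stack1.Count + _stack2.Count; } }

    public void Enqueue( T item ) {
        _stack1.Push( item );
    }

    public T Dequeue() {
        _shift();
        return _stack2.Pop();
    }

    public T Peek() {
        _shift();
        return _stack2.Peek();
    }

    // Move elements only when the outgoing stack is empty,
    // so each element is moved at most once: amortized O(1).
    private void _shift() {
        if ( _stack2.Count > 0 ) { return; }
        if ( _stack1.Count == 0 ) { throw new InvalidOperationException( "queue is empty" ); }
        while ( _stack1.Count > 0 ) {
            _stack2.Push( _stack1.Pop() );
        }
    }
}
```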
Software Engineer / Developer Algorithm
Implement class Stack with the help of only 2 queues,
i.e.:
class Stack<T> {
    private Queue<T> _queue1;
    private Queue<T> _queue2;
    // public methods: Push, Pop, Peek, Count
    // private methods, if any.
}
- zr.roman in Russia
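A sketch of the standard costly-Push variant matching the skeleton above (renamed `StackViaQueues` here only to avoid clashing with System.Collections.Generic.Queue): Push is O(n) but Pop and Peek are O(1).

```csharp
using System;
using System.Collections.Generic;

class StackViaQueues<T> {
    private Queue<T> _queue1 = new Queue<T>(); // holds all elements, newest first
    private Queue<T> _queue2 = new Queue<T>(); // scratch queue used during Push

    public int Count { get { return _queue1.Count; } }

    // O(n) Push: enqueue into the empty queue, drain the other one behind it,
    // then swap, so the newest element is always at the front of _queue1.
    public void Push( T item ) {
        _queue2.Enqueue( item );
        while ( _queue1.Count > 0 ) {
            _queue2.Enqueue( _queue1.Dequeue() );
        }
        var tmp = _queue1;
        _queue1 = _queue2;
        _queue2 = tmp;
    }

    public T Pop() {
        if ( _queue1.Count == 0 ) { throw new InvalidOperationException( "stack is empty" ); }
        return _queue1.Dequeue();
    }

    public T Peek() {
        if ( _queue1.Count == 0 ) { throw new InvalidOperationException( "stack is empty" ); }
        return _queue1.Peek();
    }
}
```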
Software Engineer / Developer Algorithm
Imagine a man reading a book.
- zr.roman in Russia
He can perform only 2 possible actions of reading:
1) read a page in a minute (careful reading),
2) read two pages in a minute (look through).
Nothing else is permitted.
Calculate the number of all possible ways to read a book with a given number of pages.
Example: given 3 pages.
The answer is 3 combinations, as follows:
1st: Careful reading (1) - careful reading (1) - careful reading (1),
2nd: Careful reading (1) - look through (2),
3rd: Look through (2) - careful reading (1).
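The example generalizes: for p pages the count f(p) satisfies f(p) = f(p-1) + f(p-2), since the last action read either 1 or 2 pages, giving a Fibonacci-like sequence with f(1) = 1, f(2) = 2. A minimal sketch (the `BookReading` class name is just for illustration):

```csharp
using System;

static class BookReading {
    // Number of ways to read a book of the given number of pages,
    // where each step reads either 1 or 2 pages: f(p) = f(p-1) + f(p-2).
    public static long CountWays( int pages ) {
        long a = 1; // f(0): one (empty) reading, by convention
        long b = 1; // f(1): careful reading only
        for ( int i = 2; i <= pages; i++ ) {
            long next = a + b;
            a = b;
            b = next;
        }
        return b;
    }
}
```

For 3 pages this returns 3, matching the three combinations listed above.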
SDE-2 Algorithm
Thanks for the attentive investigation of my code!
The drawback of the RemoveAt(0) operation can be eliminated by introducing an index which points to the logical first element of the ArrayList (improved in the code).
This is an O(n) time solution.
Though we get O(n) space complexity against O(k) in the case of a deque.
O(n) solution without deque.
c# implementation.
Edited after ninhnnsoc's comment.
private static int[] Get1( int[] arr, int k ) {
    int t = 0; // index of the logical first element (replaces RemoveAt(0))
    int[] res = new int[ arr.Length - k + 1 ];
    List<int> list = new List<int> { 0 }; // indices of max candidates for the current window
    for ( int i = 1; i < arr.Length; i++ ) {
        if ( list.Count == 0 || arr[ i ] > arr[ list[ t ] ] ) {
            // the new element dominates all current candidates
            list = new List<int> { i };
            t = 0;
        }
        else if ( arr[ i ] < arr[ list[ t ] ] ) {
            var target = arr[ list[ list.Count - 1 ] ];
            if ( arr[ i ] > target ) {
                // the new element supersedes the smaller tail candidate
                list = new List<int> { list[ 0 ] };
            }
            list.Add( i );
        }
        if ( i >= k - 1 ) {
            res[ i - k + 1 ] = arr[ list[ t ] ]; // current window maximum
            if ( arr[ i - k + 1 ] == arr[ list[ t ] ] ) {
                t++; // the max leaves the window: advance the logical head
            }
        }
    }
    return res;
}
Thanks for an interesting case and discussion.
Now I get your idea.
I was wrong, you were right.
A more careful analysis really gives O(n) time complexity, due to the idea of tracking the possible max-value candidates for each next k-window.
Nice example!
But one observation: actually, a deque is not necessary here.
We can use a simple ArrayList.
In this case we eliminate the inner loops; we do get a RemoveAt(0) operation, which is O(k), but it is performed rarely, so the tighter bound is again O(n).
c# implementation.
private static int[] Get1( int[] arr, int k ) {
    int[] res = new int[ arr.Length - k + 1 ];
    List<int> list = new List<int> { 0 }; // indices of max candidates for the current window
    for ( int i = 1; i < arr.Length; i++ ) {
        if ( list.Count == 0 || arr[ i ] > arr[ list[ 0 ] ] ) {
            // the new element dominates all current candidates
            list = new List<int> { i };
        }
        if ( arr[ i ] < arr[ list[ 0 ] ] ) {
            var target = arr[ list[ list.Count - 1 ] ];
            if ( arr[ i ] > target ) {
                list.RemoveAt( list.Count - 1 ); // drop the smaller tail candidate
            }
            list.Add( i );
        }
        if ( i >= k - 1 ) {
            res[ i - k + 1 ] = arr[ list[ 0 ] ]; // current window maximum
            if ( arr[ i - k + 1 ] == arr[ list[ 0 ] ] ) {
                list.RemoveAt( 0 ); // here we get O(k) complexity, but in rare cases,
                                    // so the overall complexity remains O(n)
            }
        }
    }
    return res;
}
Sorry, that was my fault (I replaced "while" with "if" to make sure of the time complexity).
So, the solution works, but it is not an O(n) solution; it is an O(n*k) solution.
Inside the outer "for" loop you have an inner "while" loop which searches for the new max value.
For the input [9,1,3,2,5,4,3,2,1,0], at the 5th iteration the inner "while" loop runs 2 times, which is order-k running time (in this case the loop was lucky and found the value in 2 iterations, but in the worst case, when it is unlucky, it will take order-k time).
Take another example: [9,1,3,2,5,1,3,2,4,0].
Here the inner loop runs twice, in the 5th and 9th "for" loop iterations.
And so on.
Obviously, this is not linear time, because we cannot eliminate the cost of the inner loop.
So, asymptotically, taking the worst case into account, this is O(n*k) time complexity.
Sorry for disturbing!
I understood it exactly as you describe here and even implemented a solution (posted here).
The misunderstanding was caused by the word "contiGuous"; I suppose the correct word for the task is "contiNuous" (according to the idea of the task).
Thanks for the examples anyway.
I see the mistake: dp[i] = Math.max(nums[i+k-1], dp[i-1]) - this is not quite true.
Assume we have the array {9,8,7,6,5} and k = 4; then on the 2nd step (i = 1) we'll have nums[i+k-1] = 5 and dp[i-1] = 9. This condition should lead to recalculating the max value, because the previous max (9) lies outside the current subarray {8,7,6,5}.
c# implementation.
using System;

namespace MaxContiguousArrays {
    class Program {
        private static int[] Get( int[] arr, int k ) {
            int[] dp = new int[ arr.Length - k + 1 ];
            dp[ 0 ] = Math.Max( arr[ k - 1 ], GetMaxInSubArray( arr, 0, k - 1 ) );
            for ( int i = 1; i < arr.Length; i++ ) {
                var iAhead = i + k - 1; // index of the element entering the window
                if ( iAhead >= arr.Length ) { break; }
                dp[ i ] = arr[ iAhead ] > dp[ i - 1 ]
                    ? arr[ iAhead ]
                    // the previous max may have left the window, so rescan the rest of it
                    : Math.Max( arr[ iAhead ], GetMaxInSubArray( arr, i, k - 1 ) );
            }
            return dp;
        }

        private static int GetMaxInSubArray( int[] arr, int start, int k ) {
            int maxVal = arr[ start ];
            for ( int i = start; i < start + k; i++ ) {
                if ( arr[ i ] > maxVal ) {
                    maxVal = arr[ i ];
                }
            }
            return maxVal;
        }

        static void Main( string[] args ) {
            var res = Get( new int[] { 9, 8, 7, 6, 5, 4, 3, 2, 1, 0 }, 4 );
        }
    }
}
c# implementation.
using System;

namespace Fib {
    class Program {
        // Y combinator: returns the fixed point of f
        public static Func<int, int> Y( Func<Func<int, int>, Func<int, int>> f ) {
            return x => f( Y( f ) )( x );
        }

        static void Main( string[] args ) {
            var resFunc = Y( func => {
                return new Func<int, int>( i => {
                    // iterative Fibonacci body
                    int fib0 = 0;
                    int fib1 = 1;
                    int resFib = 1;
                    for ( int j = 1; j < i; j++ ) {
                        resFib = fib0 + fib1;
                        fib0 = fib1;
                        fib1 = resFib;
                    }
                    return resFib;
                } );
            } );
            var res = resFunc.Invoke( 7 );
            Console.WriteLine( res ); // prints 13
        }
    }
}
c#, overflow is not handled.
string str = "107080023";
Dictionary<char, int> dic = new Dictionary<char, int>() {
    {'0', 0}, {'1', 1}, {'2', 2}, {'3', 3}, {'4', 4},
    {'5', 5}, {'6', 6}, {'7', 7}, {'8', 8}, {'9', 9},
};
int pos = str.Length - 1;
int res = 0;
int @base = 1; // power of 10 for the current position
while ( pos >= 0 ) {
    res += dic[ str[ pos ] ] * @base;
    @base *= 10;
    pos--;
}
Console.WriteLine( res );
As far as I get it, the meaning of "integer" here is not from the standpoint of data types, but from the standpoint of floating point (sorry for the unintentional pun).
E.g. the number 1,000,000,000,000,000,000,000,000 (10^24) is, in terms of the task, also an integer, though rather a big one.
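Under that reading, the same digit-accumulation loop can be made overflow-free with System.Numerics.BigInteger (a sketch; accumulating left-to-right, multiplying by 10 and adding each digit, avoids tracking the base explicitly; the `Atoi` class name is just for illustration):

```csharp
using System;
using System.Numerics;

static class Atoi {
    // Overflow-free string-to-integer conversion for non-negative decimal input:
    // accumulate left-to-right, multiplying by 10 and adding each digit.
    public static BigInteger Parse( string str ) {
        BigInteger res = 0;
        foreach ( var ch in str ) {
            res = res * 10 + ( ch - '0' );
        }
        return res;
    }
}
```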
c# implementation.
// requires: using System; using System.Collections.Generic; using System.Linq;
public class BSTBuilder<T> {
    public BSTNode<T> BuildTree( IEnumerable<T> collection ) {
        var array = collection.ToArray();
        Array.Sort( array ); // let's sort just to make sure
        var rootNode = _createBSTNode( array, 0, array.Length - 1 );
        return rootNode;
    }

    // the middle element becomes the subtree root, which keeps the tree balanced
    private static BSTNode<T> _createBSTNode( IReadOnlyList<T> arr, int startPos, int endPos ) {
        if ( endPos < startPos ) { return null; }
        int midPos = ( startPos + endPos ) / 2;
        var bstNode = new BSTNode<T>( arr[ midPos ] );
        var leftNode = _createBSTNode( arr, startPos, midPos - 1 );
        bstNode.SetLeft( leftNode );
        var rightNode = _createBSTNode( arr, midPos + 1, endPos );
        bstNode.SetRight( rightNode );
        return bstNode;
    }
}
c#, straightforward.
public static Node Merge( Node node1, Node node2 ) {
    if ( node1 == null ) { return node2; }
    if ( node2 == null ) { return node1; }
    if ( node1.Data < node2.Data ) { // '<' for asc order, '>' for desc order
        return Concat( node1, Merge( node2, node1.Next ) );
    }
    return Concat( node2, Merge( node1, node2.Next ) );
}

public static Node Concat( Node node1, Node node2 ) {
    node1.Next = node2;
    return node1;
}

public class Node {
    public int Data;
    public Node Next;
}
This can be done using a hashtable. There may be variations in implementation, but the general idea is as follows:
Let's have ABCDABC as the in-order traversal result.
So, we start from position 0 and put the letters one by one into the hashtable:
1) {'A', 0}
2) {'B', 1}
3) {'C', 2}
4) {'D', 3}
Then, when we try to put {'A', 4}, we'll get an exception. This is the signal that we have found a duplication. So, starting from positions 0 and 4 we obtain the duplicate parts and do all the necessary checks (described in the solution). If the checks pass, we are done. Otherwise we proceed with putting the remaining characters into the hashtable.
Certainly, this is much faster than O(n^2), but maybe a little slower than O(n). Actually, this is order O(n), or at most O(n log n).
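The first step of that idea, finding the two candidate start positions, might look like this (a sketch with a hypothetical helper name; it checks the dictionary instead of relying on the Add exception, and the follow-up verification of the two parts is not shown):

```csharp
using System;
using System.Collections.Generic;

static class DuplicateFinder {
    // Returns the start positions (first, second) of the first repeated
    // character in the traversal string, or null if every character is unique.
    public static Tuple<int, int> FindCandidateStarts( string traversal ) {
        var seen = new Dictionary<char, int>(); // character -> first position
        for ( int i = 0; i < traversal.Length; i++ ) {
            int first;
            if ( seen.TryGetValue( traversal[ i ], out first ) ) {
                return Tuple.Create( first, i ); // the duplication signal
            }
            seen.Add( traversal[ i ], i );
        }
        return null;
    }
}
```

For "ABCDABC" this yields positions 0 and 4, matching the walkthrough above.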
c#.
// requires: using System.Linq; using System.Text;
static private string MakePalindrome( string str ) {
    int index = 0;
    bool matchFound = true;
    // find the longest palindromic prefix of str
    for ( int i = str.Length - 1; i >= 0; i-- ) {
        int k = i; int j = 0;
        while ( k >= 0 ) {
            if ( str[ j ] != str[ k ] ) {
                matchFound = false; break;
            }
            k--; j++;
        }
        if ( matchFound ) { index = j; break; }
        matchFound = true;
    }
    // prepend the reversed tail, so the result reads the same both ways
    var res = new StringBuilder();
    foreach ( var ch in str.Substring( index ).Reverse() ) {
        res.Append( ch );
    }
    return res.Append( str ).ToString();
}
c#.
Totally generic merge sort except for 1 line (stressed in the code).
static private Number[] MergeSort( Number[] arr ) {
    if ( arr.Length == 1 ) { return arr; }
    var leftHalf = new Number[ arr.Length / 2 ];
    var rightHalf = new Number[ arr.Length - arr.Length / 2 ];
    int t = 0;
    for ( int i = 0; i < arr.Length; i++ ) {
        if ( i < arr.Length / 2 ) { leftHalf[ i ] = arr[ i ]; }
        else { rightHalf[ t ] = arr[ i ]; t++; }
    }
    return Merge( MergeSort( leftHalf ), MergeSort( rightHalf ) );
}

static private Number[] Merge( Number[] arr1, Number[] arr2 ) {
    if ( arr1 == null ) { return arr2; }
    if ( arr2 == null ) { return arr1; }
    Number[] tail = null;
    if ( arr1.Length > 1 ) { tail = new Number[ arr1.Length - 1 ]; }
    if ( arr1[ 0 ].Value < arr2[ 0 ].Value ) {
        for ( int i = 0; i + 1 < arr1.Length; i++ ) {
            tail[ i ] = arr1[ i + 1 ];
        }
        arr1[ 0 ].qaz += arr2.Length; // the only line added to merge sort
        return Concat( arr1[ 0 ], Merge( tail, arr2 ) );
    }
    tail = null;
    if ( arr2.Length > 1 ) { tail = new Number[ arr2.Length - 1 ]; }
    for ( int i = 0; i + 1 < arr2.Length; i++ ) {
        tail[ i ] = arr2[ i + 1 ];
    }
    return Concat( arr2[ 0 ], Merge( arr1, tail ) );
}

private static Number[] Concat( Number firstElm, Number[] arr ) {
    var res = new Number[ arr.Length + 1 ];
    res[ 0 ] = firstElm;
    for ( int i = 0; i < arr.Length; i++ ) {
        res[ i + 1 ] = arr[ i ];
    }
    return res;
}

public struct Number {
    public int Value;
    public int qaz;
}
Brute force works here with approximate complexity "log n", so all the tricks are unneeded.
Proof:
1) Given n = 1,000: brute force finds the solution in 17 steps.
2) Given n = 1,000,000: brute force finds the solution in 31 steps.
3) Given n = 1,000,000,000: brute force finds the solution in 45 steps.
etc.
The cost of the log operation is higher than the cost of division and multiplication.
The cost of multiplication is strictly higher than "n log n" (small-omega notation should be used here).
So it looks like the complexity of the solution is at least "n log n" (or higher), where n is the number of digits in the input number.
Using a hashtable.
- zr.roman January 27, 2016