
Wikipedia:Reference desk/Archives/Computing/2014 October 2

Welcome to the Wikipedia Computing Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


October 2


Myanmar telecom industry


Can I know about the Burma telecom industry? How many companies are there, and what facilities do they have, please? — Preceding unsigned comment added by 135.245.168.34 (talk) 13:53, 2 October 2014 (UTC)[reply]

The Wikipedia articles Telecommunications in Burma and Internet in Burma are probably the closest we have to answering your question. This recent BBC article discusses the boom in telecoms in the country. -- Finlay McWalterTalk 14:11, 2 October 2014 (UTC)[reply]
The BBC is probably the preeminent Western source for information on Myanmar; they regularly cover the nation's economy and political situation in their World Service news reports. Last week they had an hour-long session on the transition to the use of computers in banks in Myanmar. You can find lots of information on their main portal, Burma in Transition, and in their Myanmar country profile. Last week's World Service special report, Myanmar and Zimbabwe, is available online (in some parts of the world you might have to wait a few days or weeks for availability). That report provided lots of very informative economic statistics.
Of course, you don't have to use the internet: the BBC World Service is still broadcast into Burma on shortwave - one of the few remaining World Service shortwave services: [1]. The World Service transmits from Thailand from 2300 GMT on 5875 kHz, which is particularly useful if your internet access is intermittent.
Nimur (talk) 18:37, 2 October 2014 (UTC)[reply]

Time and space complexity of algorithms


I'm really confused: I'm unable to compute the time and space complexities of different functions. Please, can someone suggest a general way to calculate them? I tried to work out a polynomial for several algorithms, such as heap sort, merge sort and selection sort, but I didn't manage it. Consider the following code:

int unknown(int n) {
   int i, j, k = 0;
   for (i  = n/2; i <= n; i++)
       for (j = 2; j <= n; j = j * 2)
           k = k + n/2;
   return k;
}

What is the returned value of the above function in terms of Θ (theta), and how?

Please, someone help me grasp all these concepts; I'm asking all of you, and would be grateful for your help. 22:43, 2 October 2014 (UTC) 223.229.116.173 (talk)

I'm not sure exactly what you're asking. Supposing n >= 2, the inner loop runs for about lg n steps, and the outer loop executes it floor(n/2) + 1 times, so the total is just the product of the two - essentially, n log n additions. I would be willing to discuss whatever you might be interested in regarding complexity in general, but please try to clarify and ask specific questions about what you need help with. If you would like to discuss more freely, drop me a line on my talk page and we can do it there (or over email if you'd like). :-) Phoenixia1177 (talk) 05:36, 3 October 2014 (UTC)[reply]
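To turn that count into the value the function actually returns: each execution of the inner loop body adds n/2 to k, so the result is roughly (floor(n/2) + 1) × floor(lg n) × (n/2), which is Θ(n² log n) - the n log n figure above counts the additions, not the value of k. A minimal C check of that closed form (the main() harness and variable names below are just illustrative):

#include <stdio.h>

/* The function from the question, unchanged. */
int unknown(int n) {
    int i, j, k = 0;
    for (i = n/2; i <= n; i++)
        for (j = 2; j <= n; j = j * 2)
            k = k + n/2;
    return k;
}

int main(void) {
    /* Compare unknown(n) with (outer iterations) * (inner iterations) * (n/2).
       The outer loop runs n - n/2 + 1 times (floor(n/2) + 1 for even n) and
       the inner loop runs floor(lg n) times, counted directly here to avoid
       floating point. */
    for (int n = 2; n <= 1024; n *= 2) {
        int outer = n - n/2 + 1;
        int inner = 0;
        for (int j = 2; j <= n; j *= 2)
            inner++;
        int predicted = outer * inner * (n/2);
        printf("n=%5d  unknown(n)=%9d  predicted=%9d\n", n, unknown(n), predicted);
    }
    return 0;
}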
Here's a free online book on the subject, if you don't have one: [2]. There are plenty more, I'm sure. Phoenixia1177 (talk) 05:38, 3 October 2014 (UTC)[reply]
I don't believe that there is a general way to calculate these complexities. Take, for example:
 int StevesFunction ( int n )
 {
   int x = 0 ;
   for ( int i = 0 ; i < n ; i++ )
      if ( i > 1000 )
         for ( int j = 0 ; j < n ; j++ )
            x++ ;
      else
         x-- ;
   return x ;
 }
This algorithm is O(n) for n < 1000 and O(n²) for n much larger than 1000...and for values not much over 1000, it's...a mess. Automatically calculating the true performance of an arbitrary piece of code (or even determining it manually by following a formally described method) would be akin to solving the Halting problem - which is known to be impossible. Worse, in practice these O(...) metrics are only somewhat useful guidelines for what makes a good algorithm. Some techniques that are O(n²) are actually faster than O(n) solutions to the same problem, because the absolute time cost of each step is so much higher for the O(n) solution at all useful values of n. Others depend crucially on the nature of the data - just take a look at Sorting_algorithm#Comparison_of_algorithms and you'll see that O(n log n) is the best that any algorithm can guarantee for worst-case data, but many of the algorithms can achieve O(n) for best-case or near-best-case data. So if you know something about the data you are feeding to the algorithm (for instance, that it's nearly in the right order already), then you can do much better than the 'fastest-in-worst-case' sort mechanisms.
So you really have to take these estimations with a grain of salt...they are a very very rough guideline, and no more. SteveBaker (talk) 14:30, 3 October 2014 (UTC)[reply]
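To make the "nearly in the right order already" point concrete: insertion sort is the textbook case - O(n²) comparisons in the worst case, but close to O(n) on input that is already almost sorted, because its inner loop exits immediately for elements that are already in place. A minimal sketch (the test array and harness are just illustrative):

#include <stdio.h>

/* Insertion sort: O(n^2) in the worst case, but nearly O(n) on
   almost-sorted input, because the while loop below does little or
   no shifting when each element is already close to its final spot. */
void insertion_sort(int a[], int n) {
    for (int i = 1; i < n; i++) {
        int key = a[i];
        int j = i - 1;
        while (j >= 0 && a[j] > key) {
            a[j + 1] = a[j];
            j--;
        }
        a[j + 1] = key;
    }
}

int main(void) {
    int a[] = {1, 2, 3, 5, 4, 6, 7, 8};   /* nearly sorted already */
    int n = (int)(sizeof a / sizeof a[0]);
    insertion_sort(a, n);
    for (int i = 0; i < n; i++)
        printf("%d ", a[i]);
    printf("\n");
    return 0;
}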
Actually, since Big-O is asymptotic complexity, the first 1000 or million values of n don't matter, and the algorithm always has time complexity O(n²). It's true that in principle big-O can be misleading, especially if you have e.g. a large constant overhead that dominates for "realistic" use cases. That said, I've seen the notion of a "realistic specification" go from 20 formulas to 20 million formulas in my scientific career...and that does mean that O(n³) complexity had better go, even if Intel is doing its best to help out. ;-). --Stephan Schulz (talk) 14:53, 3 October 2014 (UTC)[reply]
To the OP: There is no one method, but there is a tool box (and recurrence relations take up quite a bit of space in this tool box). There is a free Stanford course up at Coursera. --Stephan Schulz (talk) 14:59, 3 October 2014 (UTC)[reply]
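To give one example of such a recurrence: for merge sort (one of the algorithms mentioned in the question), the running time satisfies T(n) = 2T(n/2) + Θ(n) - two half-size recursive calls plus a linear-time merge. Expanding it, there are about lg n levels of recursion and each level does Θ(n) work in total, so T(n) = Θ(n log n); the master theorem gives the same answer.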
Note that the number of digits can be taken to be the input size. That way, the above program has exponential running time. See Pseudo-polynomial time. --88.152.132.111 (talk) 17:07, 3 October 2014 (UTC)[reply]
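To spell that step out: if the input n is written with b bits (b is just a label here, roughly log2 n), then n = Θ(2^b), and the Θ(n log n) additions counted above become Θ(2^b · b) steps when measured against the input length b - exponential in b, even though the count looks polynomial as a function of the numeric value of n.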