Wikipedia:Reference desk/Archives/Computing/2021 December 3

Computing desk
Welcome to the Wikipedia Computing Reference Desk Archives
The page you are currently viewing is a transcluded archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


December 3

Uber distance units

How can I change the distance units with which the Uber app shows how far away my driver is? Mine used to be kilometres, here in Kyiv, but now they are in miles. I think this changed when I visited Canada and did not revert when I returned to Ukraine, but that might not be accurate.

(A year ago I asked why Uber drivers don't seem to see where I am, even when the app shows me the accurate blue dot. That question is still unresolved.)

Hayttom (talk) 07:17, 3 December 2021 (UTC)

Hi, Hayttom. The Uber app uses Google Maps APIs. Although I’m just guessing, have you tried changing the settings in your Google Maps app profile? Tap the profile icon, open Settings, and then change the distance units to your preference. Let me know if that works or not. Viriditas (talk) 22:02, 3 December 2021 (UTC)
Hello Viriditas, I'm afraid that didn't work. My Google Maps setting was 'automatic' and I changed it to 'kilometres', but it made no difference. But THANK YOU very much for the suggestion! Hayttom (talk) 09:15, 6 December 2021 (UTC)
Hayttom, sorry to hear you still have the issue. I don’t know if you are running iOS or Android, but check your date, time, and region settings. Viriditas (talk) 22:25, 6 December 2021 (UTC)

Is there a way to accomplish the equivalent of getByteFrequencyData for an entire audio track?

Hello wonderful Wikipedians! So I've been playing around with audio visualization in JavaScript. I have code that looks like this:

let audioElement = document.getElementById('source');
let audioCtx = new AudioContext();
let analyser = audioCtx.createAnalyser();
analyser.fftSize = 2048;
let source = audioCtx.createMediaElementSource(audioElement);
source.connect(analyser);
source.connect(audioCtx.destination);
var data = new Uint8Array(analyser.frequencyBinCount);

Then I can run analyser.getByteFrequencyData(data); which, while the audio is playing, populates the variable "data" with an array representing the frequency data of the audio for the current "frame", so to speak. What I want to be able to do is create an array of such data for every "frame", or alternatively to select data arbitrarily by timestamp (in milliseconds, say).

Currently I can do something like this:

let final_data = [];
let captureFunction = function() {
  analyser.getByteFrequencyData(data);
  final_data.push([...data]);
  if (!audio_ended) { // audio_ended: a flag set elsewhere, e.g. by the element's 'ended' event
    window.requestAnimationFrame(captureFunction);
  }
};
window.requestAnimationFrame(captureFunction);

This works reasonably well, but I can't technically guarantee the frequency or reliability of requestAnimationFrame firing. How might I accomplish what I'm trying to do? TheRiseOfSkittlez (talk) 12:22, 3 December 2021 (UTC)

Offhand answer (not having investigated your question properly): I expect you want to use a timer object.
Better, but still rather shallow answer: using OfflineAudioContext instead of AudioContext sounds kind of suitable. You don't actually want to play the audio, right?  Card Zero  (talk) 00:13, 5 December 2021 (UTC)
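(For the timer idea, setInterval could replace requestAnimationFrame, but it has the same real-time reliability problem. The OfflineAudioContext route avoids real-time playback entirely. A rough, untested sketch of that idea follows; analyseTrack is a placeholder name, and arrayBuffer is assumed to hold the fetched audio file. An OfflineAudioContext renders faster than real time, and its suspend() method can pause rendering at scheduled timestamps so the analyser can be sampled once per "frame".)

async function analyseTrack(arrayBuffer) {
  // Decode once with a regular context just to obtain an AudioBuffer.
  const decoded = await new AudioContext().decodeAudioData(arrayBuffer);

  const offlineCtx = new OfflineAudioContext(
      decoded.numberOfChannels, decoded.length, decoded.sampleRate);
  const analyser = offlineCtx.createAnalyser();
  analyser.fftSize = 2048;
  const data = new Uint8Array(analyser.frequencyBinCount);
  const frames = [];

  const source = offlineCtx.createBufferSource();
  source.buffer = decoded;
  source.connect(analyser);
  analyser.connect(offlineCtx.destination);

  // Schedule a suspend at every 1/60 s "frame"; all suspends must be
  // scheduled before startRendering() is called.
  const step = 1 / 60;
  for (let t = step; t < decoded.duration; t += step) {
    offlineCtx.suspend(t).then(() => {
      analyser.getByteFrequencyData(data);
      frames.push([...data]);
      offlineCtx.resume();
    });
  }

  source.start();
  await offlineCtx.startRendering(); // renders the whole track, not in real time
  return frames; // one frequency snapshot per "frame"
}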
Were you looking at this, which is what I get as the first result of a Web search for "JavaScript audio visualizer"? Your code looks similar. If you want to do something like seek through the audio source, you do that with the HTMLMediaElement member functions. If you want to go through the source and build arrays for each frame, that's gonna eat up some memory. Can you describe in more detail the end goal you're trying to accomplish? Are you still just trying to make visualizations, or are you trying to make your code do other stuff? --47.155.96.47 (talk) 02:15, 5 December 2021 (UTC)
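(For illustration, seeking via HTMLMediaElement might look like the following sketch; seekTo is a hypothetical helper and audioElement is reused from the code above. Note that a MediaElementAudioSourceNode only feeds the analyser meaningful data while the element is actually playing, so any capture should happen after the 'seeked' event with playback running.)

function seekTo(seconds) {
  return new Promise(resolve => {
    audioElement.addEventListener('seeked', resolve, { once: true });
    audioElement.currentTime = seconds; // jump to an arbitrary timestamp
  });
}
// usage: await seekTo(12.5); then capture on the next animation frame as before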
I want to do audio visualization, but not in real time. I want to render it frame-by-frame and snapshot each frame. TheRiseOfSkittlez (talk) 07:05, 5 December 2021 (UTC)
Sorry for the delay, personal stuff. If you really need to do this, the ideal way is to implement a full JavaScript audio player that also spits out the visualizations. That way you fetch the audio yourself and control the whole pipeline up to sending stuff to sound output. You can seek around and process the audio at will, since your program is what's handling all of it. The Web Audio API is intended for the use case where you're letting the browser do the heavy lifting and you want to plug things into a processing pipeline to do transformations on the audio before it's output to speakers or whatever, such as applying effects. Now, you could take existing JavaScript players or libraries and try to hook your code into them. If you're just trying to do this for education, what's happening is that your intended task has bumped up against the limitations of the environment you're working in. I mean, you can theoretically do anything within JavaScript and the browser DOM, but that doesn't mean everything will be simple or painless. If you're interested in doing fancier stuff with your code, this might be a good time to move to a more general-purpose programming language. --47.155.96.47 (talk) 07:47, 10 December 2021 (UTC)
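(A sketch of the first half of that approach, with assumed names throughout; 'track.mp3' is a placeholder URL. Fetching and decoding the file yourself makes the raw samples for any timestamp directly addressable; the FFT is then left to a library of your choice, since the AnalyserNode is no longer in the loop.)

async function loadTrack() {
  const response = await fetch('track.mp3'); // placeholder URL
  const arrayBuffer = await response.arrayBuffer();
  // decodeAudioData yields an AudioBuffer of raw PCM samples
  return await new AudioContext().decodeAudioData(arrayBuffer);
}

// Return a 2048-sample window of channel 0 starting at a timestamp in ms.
function windowAt(audioBuffer, ms, size = 2048) {
  const samples = audioBuffer.getChannelData(0); // Float32Array
  const start = Math.floor((ms / 1000) * audioBuffer.sampleRate);
  return samples.slice(start, start + size);
}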