Shazam-like feature in an iOS app?

I would like to write an app that allows users to identify songs by putting the mic next to a speaker and listening to the song for a few seconds... so exactly what Shazam does.
Is there any framework, library, or service out there that I can use to accomplish that in iOS?

You need an API which you can query. An example of such an API is Gracenote.
You could also have a look at MusicBrainz.
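To give a sense of what querying such an API looks like, here is a minimal sketch of a text search against the MusicBrainz web service. Note this is a metadata lookup by title and artist only; identifying a song from raw audio additionally needs a fingerprinting step, as in the Echoprint answer below. The song in the query is just an example.
import Foundation

// Sketch: search MusicBrainz for a recording by title and artist (metadata lookup only).
let query = "recording:\"Paranoid Android\" AND artist:\"Radiohead\""
    .addingPercentEncoding(withAllowedCharacters: .urlQueryAllowed)!
let url = URL(string: "https://musicbrainz.org/ws/2/recording/?query=\(query)&fmt=json")!

URLSession.shared.dataTask(with: url) { data, _, _ in
    if let data = data, let json = String(data: data, encoding: .utf8) {
        print(json)   // JSON list of matching recordings with artist and release info
    }
}.resume()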

Yes, you can have a look at the Echoprint library developed by The Echo Nest here.
They provide a C++ library to compute the audio fingerprint which can be used on iOS, and they also provide an iOS example.
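For the capture side, here is a rough sketch; it assumes you bridge the Echoprint C++ codegen yourself, and computeFingerprint(from:) is a hypothetical wrapper around that call.
import AVFoundation

// Record a short mono PCM sample from the mic, to be fed to the fingerprinting code.
// (Configure AVAudioSession for recording and request microphone permission first.)
let fileURL = FileManager.default.temporaryDirectory.appendingPathComponent("sample.caf")
let settings: [String: Any] = [
    AVFormatIDKey: kAudioFormatLinearPCM,
    AVSampleRateKey: 11025,   // low-rate mono; match whatever the codegen's README specifies
    AVNumberOfChannelsKey: 1
]
let recorder = try AVAudioRecorder(url: fileURL, settings: settings)
_ = recorder.record(forDuration: 20)   // a sample of roughly 10-20 seconds
// Later, e.g. in audioRecorderDidFinishRecording(_:successfully:):
// let fingerprint = computeFingerprint(from: fileURL)   // hypothetical bridge to the C++ codegen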

Related

Video Streaming and Broadcasting using WebRTC

I am very new to real-time protocols and I have some questions about how WebRTC works and how I can implement it. I am trying to create a one-to-many livestream like Facebook or Periscope, where one user broadcasts and other users join and watch the stream. I am using Swift on the client side.
My questions are:
How do I broadcast a video using WebRTC?
Is there an SDK for WebRTC in Swift/iOS?
I know the questions are very vague, but guidance in the right direction would be great because I am not sure where to start.
You will need to use backend servers for that.
If you plan on broadcasting to multiple users directly from your mobile app then stop...
You need to connect your mobile app to a backend media server, which can then broadcast the video to a larger audience.
There are several commercial and open source alternatives that enable you to do that. I'd check Red5 Pro, Wowza, SwitchRTC, Jitsi, Janus and Kurento for this task.
For the client side, look at react-native-webrtc
You can find more tools for WebRTC developers here.
Regarding your question (2), there's also an SDK for iOS here and a neat getting-started page here (although it's about 2.5 years old, I haven't found anything better so far).
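To give a rough idea of what the client side looks like with that iOS SDK, here is a minimal sketch using the GoogleWebRTC CocoaPod. Class and method names can shift between releases, so treat this as an outline rather than copy-paste code; the signaling exchange with your media server is not shown.
import WebRTC

// Create a peer connection and attach a local audio track; the offer/answer and ICE
// exchange with the media server (signaling) happens elsewhere.
let factory = RTCPeerConnectionFactory()

let config = RTCConfiguration()
config.iceServers = [RTCIceServer(urlStrings: ["stun:stun.l.google.com:19302"])]

let constraints = RTCMediaConstraints(mandatoryConstraints: nil, optionalConstraints: nil)
let peerConnection = factory.peerConnection(with: config, constraints: constraints, delegate: nil)

let audioSource = factory.audioSource(with: constraints)
let audioTrack = factory.audioTrack(with: audioSource, trackId: "audio0")
peerConnection.add(audioTrack, streamIds: ["stream0"])
// For the broadcast use case, the media server (Wowza, Janus, Kurento, ...) receives this
// stream and fans it out to the viewers.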

iOS: is it really impossible to get info about the current track being played in third-party apps (like Spotify)?

So far, I have read many conflicting answers about this.
In this SO thread, the suggestion is to use:
let player = MPMusicPlayerController.systemMusicPlayer()
if let mediaItem = player.nowPlayingItem {
// ...
}
However, this only works with the iOS player. If the current song is being played by Spotify for example, mediaItem will be nil.
I understand that Apple's policy doesn't allow access to any other application's data. The only thing I am able to do right now is to know whether a song is playing from another player, with the help of AVAudioSession's secondaryAudioShouldBeSilencedHint and isOtherAudioPlaying.
I want to know, however, if there is another way to access it, for example through a Spotify framework? (I am not familiar with it at all; I'm just making assumptions.)
Thanks for your help.
I am not sure about iOS, but the current track can be read from Spotify on a Mac via AppleScript. I use this technique from Objective-C; if you're interested I can post the code.
Spotify publishes their AppleScript API here: https://developer.spotify.com/applescript-api/
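For reference, a minimal Swift sketch of that macOS approach (assuming the Spotify desktop app is running; this does not work on iOS):
import Foundation

// Ask the Spotify desktop app for its current track via AppleScript (macOS only).
let source = "tell application \"Spotify\" to get name of current track & \" - \" & artist of current track"
var errorInfo: NSDictionary?
if let script = NSAppleScript(source: source) {
    let result = script.executeAndReturnError(&errorInfo)
    print(result.stringValue ?? "Nothing playing")
}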
If you're looking for a generic way of determining what is playing then I think that you will be disappointed. Each application will have a different way of retrieving this information.
So yes and no. If you specifically only want to check whether Spotify is playing, then perhaps the Spotify iOS SDK provides functionality for such a thing; I don't really know that SDK's functionality.
I would venture to guess that your actual goal is to see if any third-party app is playing: Pandora, Tidal, Apple Music, Amazon Prime Music, etc. In that case, you would need a framework for each one that provides such functionality.
Apps are sandboxed from each other for security, so yes: there is no way to tell the current track information unless you have the relevant framework in place and it provides that functionality.
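In other words, from your own sandbox you can only detect that some other app is playing audio, not what it is playing. A minimal sketch of that check:
import AVFoundation

// Detect whether another app is currently playing audio (no track info is available).
let session = AVAudioSession.sharedInstance()
if session.isOtherAudioPlaying {
    // Some other app (Spotify, Pandora, ...) is playing audio right now.
}
if session.secondaryAudioShouldBeSilencedHint {
    // Another app is playing primary audio, so our own secondary audio should stay silent.
}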

How do I upload a video in a tweet in an iOS app using Xcode

I am new to iOS development and need to make a change to an iOS app I'm taking over so that it can add video to a tweet. The current app UI allows the user to type in text for a tweet, but I would be changing that to allow them to pick a video to upload along with the tweet, similar to how the Twitter app works.
I see the Twitter API supports uploading video, but I haven't been able to find any good examples of how to accomplish this using Xcode and Objective-C. Any recommended approaches or toolkits I can leverage to accomplish this?
https://dev.twitter.com/rest/public/uploading-media
I had to roll my own solution. Check out my project https://github.com/mtrung/TwitterVideoUpload.
It's lightweight, since it uses Apple's built-in Social framework; there's no need to add extra frameworks such as TwitterKit and Fabric.
It supports chunked upload.
It has built-in support for retrieving the user's credentials.
Thanks for the -1, that was helpful. I thought I had found the answer with Fabric (https://get.fabric.io/), but while the Android side supports image and video upload with a tweet, the iOS side does not (image only). It looks like you have to roll your own solution, including building a video picker; then you can use the Twitter REST API to upload the video. Not exactly what I was hoping for, but it is doable.
This link shows Objective-C and Swift code to do the video upload: Share video on Twitter with Fabric API without composer iOS.
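For orientation, here is a rough sketch of the first (INIT) step of Twitter's chunked media upload using the Social framework's SLRequest. twitterAccount (an ACAccount from ACAccountStore) and videoData (the picked video as Data) are assumed to exist already.
import Social
import Accounts

// INIT step of the chunked upload; APPEND and FINALIZE requests follow the same pattern.
let uploadURL = URL(string: "https://upload.twitter.com/1.1/media/upload.json")!
let params: [String: String] = [
    "command": "INIT",
    "media_type": "video/mp4",
    "total_bytes": String(videoData.count)   // videoData: Data of the picked video (assumed)
]
let request = SLRequest(forServiceType: SLServiceTypeTwitter,
                        requestMethod: .POST,
                        url: uploadURL,
                        parameters: params)
request.account = twitterAccount   // ACAccount retrieved from ACAccountStore (assumed)
request.perform { data, response, error in
    // Parse media_id_string from the JSON response, send the video in APPEND chunks,
    // issue FINALIZE, then post the tweet with the returned media_id.
}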

How To Push Music Playable Data Streamed From Spotify To A Device That Does Not Use The SDK Provided By Spotify

I apologise if the title of my question causes any confusion; I will explain my purpose in detail.
We are currently developing our own WiFi speaker, which is built on MIPS. The speaker comes with an app that will be used to manage it. One of the features we would like to include in the app is accessing content on Spotify and being able to play it on the speaker.
Unfortunately, after going through the iOS SDK documentation and doing some tests on the Web API Console provided by Spotify, I noticed that Spotify does not allow developers to directly get the URL of a song, except for preview purposes. I also wasn't able to find any way to get the raw bytes of the music streamed from the server; every piece of content only comes with a corresponding URI that is used for requests.
For the device (WiFi speaker) part, we recently contacted Spotify and asked for an SDK that can be used for development. However, Spotify told us that they only have SDKs for the x86 and ARM architectures; they don't have one for MIPS.
Now, here are my questions:
Is there any way for me to push music from an app to the WiFi speaker without having to use an SDK on the backend device?
If Spotify can provide an SDK for our device, how can we integrate that SDK with our platform?
I'll explain my second question for clarity. Android and iOS are popular platforms that are widely used by mobile devices, so if Spotify provides SDKs for those two operating systems, apps can use the default system frameworks to play the content (in iOS, that's the AVFoundation framework). However, if Spotify were able to provide the SDK that we need, how would we integrate it with our own platform?
I will answer your question no. 1:
You should be able to push music from the app using a buffer that you can read from with Core Audio and then forward to a device of your choice. I think what you are looking for can be found in CocoaLibSpotify.
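CocoaLibSpotify hands you decoded PCM through its own audio-delivery callbacks; as an illustration of the general pattern (grab PCM buffers and forward them yourself), here is a sketch using AVAudioEngine's tap API rather than CocoaLibSpotify's actual interface. sendToSpeaker(_:) stands in for your own transport code to the WiFi speaker.
import AVFoundation

// Install a tap on an audio node to receive raw PCM buffers that can be forwarded elsewhere.
let engine = AVAudioEngine()
let playerNode = AVAudioPlayerNode()
engine.attach(playerNode)
engine.connect(playerNode, to: engine.mainMixerNode, format: nil)

playerNode.installTap(onBus: 0, bufferSize: 4096, format: nil) { buffer, time in
    sendToSpeaker(buffer)   // hypothetical: push the PCM buffer to the WiFi speaker
}

try engine.start()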

Read user's music library within Phonegap

I'm currently developing an app with Phonegap which uses peer-to-peer connections through WebRTC. For my purposes I need to list the sounds available on the user's device.
So I'd like to know whether it's currently possible with Phonegap to access the user's music library and, for example, list all available songs sorted by artist. I came across this article from Aurelio de Rosa, but I tested it and it doesn't seem to work on iOS.
Any suggestions? Or is there maybe a plugin around which I'm not aware of?
You can find the iOS SDK Music Library Access example code here. I expect you will need to write a plugin to expose this to Cordova.
Your link should work, but only with music that you store inside your app sandbox or inside the assets (inside the www folder).
If you want to use the music library you will need a plugin.
I have found one, but it's very old and you will need to update it. It searches the music and plays it natively too:
https://github.com/hutley/HelloPhoneGap1.0/tree/master/HelloPhoneGap1/Plugins/iPod
Here you can find a tutorial about how to create a music player using Music Library Access, but it's in Japanese (the code is in English): http://blog.asial.co.jp/884
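The native piece such a plugin needs to wrap is quite small. A sketch of listing the user's songs by artist with MediaPlayer (on newer iOS versions you also need the NSAppleMusicUsageDescription key in Info.plist):
import MediaPlayer

// Query the local music library and group the songs by artist.
let query = MPMediaQuery.songs()
query.groupingType = .artist
for item in query.items ?? [] {
    let title = item.title ?? "Unknown title"
    let artist = item.artist ?? "Unknown artist"
    print("\(artist) - \(title)")   // a real plugin would collect these and return them to JS
}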
