Compare commits

...

203 Commits

Author SHA1 Message Date
Cameron Gutman
214461e123 Version 5.6.1 2017-12-01 00:42:43 -08:00
Cameron Gutman
b0144a3256 Update decoder-errata.txt with HEVC errata 2017-12-01 00:40:32 -08:00
Cameron Gutman
3171256c6e Remove EvdevCaptureProvider components from non-root build 2017-12-01 00:37:25 -08:00
Cameron Gutman
5c69f6716c Don't build evdev_reader for the non-root variant 2017-12-01 00:10:55 -08:00
Cameron Gutman
6264781539 Update common to get decoder compatibility fixes 2017-11-30 23:47:08 -08:00
Cameron Gutman
0225f534d0 Fix H.265 streaming issues with MediaTek Android TV devices 2017-11-29 20:27:33 -08:00
Cameron Gutman
284a31737e Catch input buffer too small 2017-11-28 19:33:34 -08:00
Cameron Gutman
b37a2dea57 Fix help display on some Android TV devices 2017-11-25 15:08:22 -08:00
Cameron Gutman
5c865e7f36 Version 5.6 r4 2017-11-25 14:33:41 -08:00
Cameron Gutman
04d9aea8c8 Detect and report decoder hangs 2017-11-25 14:27:04 -08:00
Cameron Gutman
b6f52db9c3 Fix crash when input events are received and no H.264 decoder is present 2017-11-25 13:35:46 -08:00
Cameron Gutman
99d2e40683 Reset HDR when decoder crashes 3 times in a row 2017-11-25 13:21:04 -08:00
Cameron Gutman
02c4ed2724 Improve decoder crash reporting reliability 2017-11-25 13:19:30 -08:00
Cameron Gutman
5f4aab8f94 Improve decoder crash reporting detail 2017-11-25 12:56:54 -08:00
Cameron Gutman
ec65901003 Report frames rendered in decoder crash report 2017-11-25 11:25:04 -08:00
Cameron Gutman
915acee88d Version 5.6 r3 2017-11-23 11:41:20 -08:00
Cameron Gutman
300d444f71 Ensure inForeground is set before CMS binding can complete 2017-11-23 11:34:22 -08:00
Cameron Gutman
f37ab40c2f Fix race condition between completeOnCreate() and onConfigurationChanged() 2017-11-23 11:25:51 -08:00
Cameron Gutman
16e285d926 Version 5.6 r2 2017-11-21 21:33:06 -08:00
Cameron Gutman
f2d122a275 Fix screen dimensions for portrait devices 2017-11-21 20:18:28 -08:00
Cameron Gutman
bfa5a6349e Ensure MediaCodecHelper is initialized before evaluating codecs 2017-11-21 19:27:08 -08:00
Cameron Gutman
a56689aea3 Always include resolutions that fit on the display 2017-11-21 19:18:41 -08:00
Cameron Gutman
3a5ba820cb Version 5.6 2017-11-20 23:08:43 -08:00
Cameron Gutman
ec69fef36f Ignore back button presses on the default context 2017-11-20 22:46:57 -08:00
Cameron Gutman
ff38074f55 Report GL Renderer in RendererException 2017-11-20 22:38:22 -08:00
Cameron Gutman
85d0ce0c40 Update Gradle to 3.0.1 2017-11-20 22:28:54 -08:00
Cameron Gutman
777129ca90 Move GLRenderer fetching into PcView to avoid race conditions inside Game activity and cache the result 2017-11-20 22:28:19 -08:00
Cameron Gutman
06156c4d68 Ignore back from goodix_fp device 2017-11-20 21:03:36 -08:00
Cameron Gutman
1c725b9dac Don't use reference picture invalidation on low-end Snapdragon SoCs 2017-11-20 20:56:31 -08:00
Cameron Gutman
f761ee52db Exclude resolutions that are not supported by the decoders 2017-11-18 19:47:39 -08:00
Cameron Gutman
05e8cfcc0a Report adaptive playback status in crash reports 2017-11-18 18:31:12 -08:00
Cameron Gutman
912925ef2c Disable performance optimizations when in multi-window 2017-11-18 17:14:40 -08:00
Cameron Gutman
4deb881ec8 Enable adaptive playback on non-Intel devices 2017-11-18 16:37:17 -08:00
Cameron Gutman
f55d6308ce Pass source rect to PiP to smoothly animate to 16:9 2017-11-18 16:29:03 -08:00
Cameron Gutman
44a3a141c0 Submit H.264 CSD in a single blob to try to prevent some decoder crashes 2017-11-18 15:14:25 -08:00
Cameron Gutman
37b5ba004c Fix IDR frame NALU drop race condition 2017-11-18 14:43:04 -08:00
Cameron Gutman
b774b47213 Update for NDK 16 (deprecating MIPS) 2017-11-18 13:38:45 -08:00
Cameron Gutman
74dc00445e Version 5.5 2017-11-10 01:19:23 -08:00
Cameron Gutman
3b4563d5ea Suppress digital trigger events if an analog trigger axis is present. Fixes #465 2017-11-10 00:50:02 -08:00
Cameron Gutman
38669817b4 Update common to fix HEVC artifacts in some apps 2017-11-10 00:10:21 -08:00
Cameron Gutman
8f1d3ae04e Add support for PiP on Oreo 2017-11-09 23:28:22 -08:00
Cameron Gutman
74ed95871b Exclude HDR toggle when the device doesn't support it 2017-11-09 21:57:33 -08:00
Cameron Gutman
cc5d67616c Prevent false USB access prompts due to races with kernel input stack bringup 2017-11-09 21:14:10 -08:00
Cameron Gutman
eed7f09e6f Fix numpad operator keys not working 2017-11-07 22:03:40 -08:00
Cameron Gutman
e3c1d23744 Fix SHIELD remote back button not working 2017-11-07 21:45:07 -08:00
Cameron Gutman
c4b1200b43 Update build tools to 27.0.1 2017-11-07 21:44:27 -08:00
Cameron Gutman
dff09f33a3 Fix shift not working on soft keyboard 2017-11-07 00:27:27 -08:00
Cameron Gutman
1f6b1dc2fe Send different VK codes for left and right ctrl/alt/shift keys. Fixes #318 2017-11-06 23:38:48 -08:00
Cameron Gutman
3f118dae93 Add HDR support and tweak HEVC supported decoders 2017-11-05 19:31:05 -08:00
Cameron Gutman
91a30ff6fe Target O MR1 2017-11-05 15:43:11 -08:00
BryanHaley
5102669b06 Virtual L3 R3 Buttons (#453)
* Added virtual L3 R3 options to better support gamepads missing these buttons.

* Update preferences.xml
2017-11-05 13:57:02 -08:00
Cameron Gutman
2e2f09be00 Fix frame drops when stopping the stream 2017-11-05 13:49:06 -08:00
Cameron Gutman
c402103fe3 Avoid colliding with System UI in multi-window mode 2017-11-05 13:15:06 -08:00
Cameron Gutman
5e5df8abc8 Add never drop frames option for devices with micro-stuttering issues 2017-11-05 12:29:33 -08:00
Cameron Gutman
d125eb7b16 Update to gradle 3.0.0 2017-11-05 12:08:16 -08:00
Cameron Gutman
a116858493 Add .debug suffix to debug builds 2017-11-05 12:07:52 -08:00
Cameron Gutman
5f3b333e98 Version 5.2.1 2017-10-17 00:38:59 -07:00
Cameron Gutman
80a37855c7 Merge branch 'master' of github.com:moonlight-stream/moonlight-android 2017-10-17 00:37:00 -07:00
Cameron Gutman
5db1ec8ec0 Fix support for GFE 3.10 2017-10-17 00:35:36 -07:00
Cameron Gutman
8911c58e50 Block OMX.ffmpeg software decoders 2017-10-17 00:31:26 -07:00
Cameron Gutman
780a64694d Fix NPE when input device is removed during enumeration 2017-10-17 00:07:51 -07:00
Cameron Gutman
3c5ea9c8c3 Remove Nvidia's HEVC decoder from the hard blacklist now that it seems to be fine on Foster NRD90M 2017-10-08 22:06:06 -07:00
Cameron Gutman
40d1436ce3 Update for AS 3.0 Beta 7 2017-10-04 19:30:36 -07:00
Cameron Gutman
dbb02acd37 Reintroduce the 75% HEVC bitrate multiplier that the old streaming core had 2017-09-25 21:39:53 -07:00
Cameron Gutman
20c4eac4ef Force HEVC disabled on Qualcomm SoCs older than Snapdragon 805 2017-09-19 21:21:23 -07:00
Cameron Gutman
b9f1142af7 Version 5.2 2017-09-09 18:53:36 -07:00
Cameron Gutman
38a6a2b74a A few fixes for decoder crash notifications 2017-09-09 18:44:06 -07:00
Cameron Gutman
fd2421618a Update common-c with crash fix 2017-09-09 17:40:53 -07:00
Cameron Gutman
79a9ea7179 Add decoder crash notification and settings reset on continued crashing 2017-09-09 17:40:07 -07:00
Cameron Gutman
34a11c9262 Correct reachability when restoring a lost address 2017-09-09 16:02:39 -07:00
Cameron Gutman
84a9845c1d Fix polling overwriting manually entered IP addresses 2017-09-09 15:40:07 -07:00
Cameron Gutman
5b05220008 Prevent mDNS from overwriting external IP addresses 2017-09-09 15:21:31 -07:00
Cameron Gutman
b2bd7257e1 Fix Lint warnings 2017-09-09 14:12:54 -07:00
Cameron Gutman
46a998c113 Convert address fields to strings to better manage DNS names 2017-09-09 13:39:54 -07:00
Cameron Gutman
60cd951774 Rename localIp/remoteIp fields to localAddress/remoteAddress to prepare for DNS names 2017-09-09 12:47:23 -07:00
Cameron Gutman
d4f8d8f689 Switch database storage to use strings for addresses 2017-09-09 12:43:20 -07:00
Cameron Gutman
608a0ebb5b Update build files for AS3b5 2017-09-09 11:50:42 -07:00
Cameron Gutman
f01a15d182 Removed duplicated current address logic 2017-09-09 11:49:15 -07:00
Cameron Gutman
0268b4f958 Update gradle for AS 3.0b4 2017-09-03 12:52:18 -07:00
Cameron Gutman
d71cf0eb98 Add app category for Oreo 2017-09-02 13:48:45 -07:00
Cameron Gutman
10ab40f823 Add/update remaining assets 2017-09-02 13:48:11 -07:00
Cameron Gutman
427edfa021 Update common submodule 2017-09-01 19:11:49 -07:00
Cameron Gutman
6f18831d5c Update BouncyCastle libs 2017-09-01 18:39:49 -07:00
Cameron Gutman
a3db09f422 Disable input compatibility mode on ChromeOS 2017-09-01 18:07:18 -07:00
Cameron Gutman
d185a05b1d Sort and sync vendor IDs with xpad 2017-08-25 21:04:36 -07:00
Cameron Gutman
78e575504a Update straggling app icon 2017-08-23 23:07:03 -07:00
Cameron Gutman
0a0be19b69 Fix brown-paper-bag bug in audio init error checking 2017-08-22 00:17:03 -07:00
Cameron Gutman
0792157e9d Fix some markdown errors and tweak supported GPUs 2017-08-13 23:53:18 -07:00
madmario1000
cdd0ecf0b7 Update README.md (#400)
Clarify the required specs a bit
2017-08-13 23:49:49 -07:00
Cameron Gutman
1ac721a35b Bump to version 5.1.2 2017-08-13 18:51:22 -07:00
Cameron Gutman
e49b1c92a2 Update for AS 3.0 Beta 2 2017-08-13 18:37:31 -07:00
Cameron Gutman
db4295bf83 Add adaptive icon for PC shortcut 2017-08-13 18:31:09 -07:00
Cameron Gutman
824c37f9d5 Adaptive launcher icon 2017-08-13 18:06:53 -07:00
Cameron Gutman
acf4426952 Update for Gradle 4 2017-06-24 12:56:52 -07:00
Cameron Gutman
e8c50342ab Version 5.1.1 2017-06-17 16:06:57 -07:00
Cameron Gutman
598995de3b Fix audio renderer using non-existant classes on Lollipop 2017-06-16 20:27:03 -07:00
Cameron Gutman
01cf0cc649 Fix Lint error 2017-06-16 20:06:45 -07:00
Cameron Gutman
fa560f462f Add battery saver mode 2017-06-16 20:01:41 -07:00
Cameron Gutman
f6e40118a9 Bring back the warning displayed if video decoder initialization fails 2017-06-16 19:50:50 -07:00
Cameron Gutman
fe7148dbd4 Only throw decoder exceptions if we're still receiving them after 3 seconds 2017-06-16 19:39:15 -07:00
Cameron Gutman
60de065836 Cleanup video decoder teardown paths 2017-06-16 19:11:39 -07:00
Cameron Gutman
6f82f82abb Use low latency audio pathway on Lollipop and later 2017-06-16 19:08:15 -07:00
Cameron Gutman
42f18cb4ac Version 5.1 2017-06-11 17:14:15 -07:00
Cameron Gutman
1bbd0054c2 Update common again to fix another long haul testing bug 2017-06-11 14:56:41 -07:00
Cameron Gutman
acdde37a3a Update common library again 2017-06-11 13:54:05 -07:00
Cameron Gutman
ad40e12167 Update common to fix incorrect assert firing 2017-06-11 13:09:29 -07:00
Cameron Gutman
1b3322b5ee Suppress crashes if the surface has become invalid 2017-06-10 17:25:23 -07:00
Cameron Gutman
6340ec6c6d Consolidate handling of decoder exceptions 2017-06-10 16:57:37 -07:00
Cameron Gutman
babd92c8c0 Add additional information to total frame latency and RendererException 2017-06-10 16:45:07 -07:00
Cameron Gutman
7f1fe5f520 Fix NPE if the device doesn't support H.264 hardware decoding 2017-06-10 11:48:25 -07:00
Cameron Gutman
01458770d2 Fix NPE enumerating input devices 2017-06-10 11:45:12 -07:00
Cameron Gutman
8d05f044f5 Allow software decoding on the emulator for testing 2017-06-08 22:21:51 -07:00
Cameron Gutman
f5680b59a5 Use debug moonlight-common with asserts enabled on debug builds and release moonlight-common with asserts disabled on release builds 2017-06-08 19:57:55 -07:00
Cameron Gutman
0ecf86c7ed At long last, Android has native mouse capture. Don't show the root version to users running O 2017-06-08 18:26:12 -07:00
Cameron Gutman
6789e8d497 Immediately call stopConnection() rather than waiting for activity stop on connection failure 2017-06-08 18:24:22 -07:00
Cameron Gutman
7d0160d556 Update gradle and SDK to O 2017-06-08 18:17:59 -07:00
Cameron Gutman
f6a0990432 Final fixes for Android O pointer capture 2017-06-08 18:17:34 -07:00
Cameron Gutman
5d6094df97 Version 5.0.2 2017-06-08 17:57:57 -07:00
Cameron Gutman
d98d4aeda2 Fix FEC fencepost error in moonlight-common 2017-06-08 17:57:43 -07:00
Cameron Gutman
852dcf5a2d Merge branch 'o-bringup' 2017-06-08 17:30:10 -07:00
Cameron Gutman
82e5aa122d Update common with FEC and latency fixes 2017-06-07 23:17:06 -07:00
Cameron Gutman
fe237d1da3 Fix some exceptions that escaped on decoder shutdown and surface loss 2017-06-07 20:01:09 -07:00
Cameron Gutman
e199fcd2d9 Try allowing decoder exceptions after initial start since we shouldn't throw on stop anymore 2017-06-06 22:50:08 -07:00
Cameron Gutman
d7c6f63592 Force Qualcomm and Samsung HEVC decoders disabled to avoid crashes and poor performance 2017-06-06 22:49:09 -07:00
Cameron Gutman
4b9c6b149a Remove the decoder stop hack and try to workaround the issue differently 2017-06-06 22:48:28 -07:00
Cameron Gutman
d1e41e41a1 Stop the connection in onStop() to try to avoid deadlocks due to surface loss. Also avoid calling stopConnection() from connection listener callbacks due to deadlock risk. 2017-06-05 20:33:23 -07:00
Cameron Gutman
96dfe25a14 Support packet size adjustments on LANs 2017-06-03 11:51:35 -07:00
Cameron Gutman
f76d78607a Improve HEVC decoder compatibility by submitting VPS+SPS+PPS in one CSD blob rather than individually 2017-06-03 11:46:29 -07:00
Cameron Gutman
a96f688bb2 Disable backup of preferences due to the device-specific data contained there 2017-06-02 20:00:37 -07:00
Cameron Gutman
90a1e68c68 Move input capture check to not mask touch events 2017-06-02 18:17:18 -07:00
Cameron Gutman
b287606106 Fix Pixel C keyboard d-pad regression due to aliasing with SOURCE_GAMEPAD 2017-05-31 21:51:32 -07:00
Cameron Gutman
a413185085 Fix Pixel C keyboard d-pad regression due to aliasing with SOURCE_GAMEPAD 2017-05-31 21:51:01 -07:00
Cameron Gutman
aa1b283570 Initial working pointer capture using onClick 2017-05-31 21:46:53 -07:00
Cameron Gutman
f07c886711 Add isCapturing() method to mouse capture providers 2017-05-31 21:26:26 -07:00
Cameron Gutman
e66b1ebec9 Initial pointer capture work for O 2017-05-31 20:50:47 -07:00
Cameron Gutman
d06912e81a Name the spinner threads so they are easily identified 2017-05-31 19:05:25 -07:00
Cameron Gutman
08bcd97594 Use a less power intensive way of keeping the DVFS state friendly 2017-05-29 20:11:39 -07:00
Cameron Gutman
49e51f5f6f 5.0.1 r2 2017-05-21 14:43:53 -07:00
Cameron Gutman
4223a7fd30 Version 5.0.1 2017-05-21 14:27:39 -07:00
Cameron Gutman
6edd0ab540 Only use RFI on modern Intel devices 2017-05-21 14:15:05 -07:00
Cameron Gutman
ce7146175a Merge remote-tracking branch 'origin/new-core' 2017-05-21 14:05:00 -07:00
Cameron Gutman
3176a85f35 Enable RFI for Intel decoders 2017-05-21 14:01:30 -07:00
Cameron Gutman
ad1c11bba5 Decouple direct submit producer and polling consumer 2017-05-21 13:48:02 -07:00
Cameron Gutman
ac640a6842 Fix a few small nits with keyboard and dpad navigation of the UI 2017-05-21 13:24:18 -07:00
Cameron Gutman
8962497a8c Fix deadlocks in audio and video stream shutdown using the new callbacks 2017-05-21 13:07:19 -07:00
Cameron Gutman
83141d3f91 Version 5.0.0 r2 2017-05-18 13:42:48 -07:00
Cameron Gutman
55f2e89bbe Reuse callback buffers 2017-05-18 13:37:02 -07:00
Cameron Gutman
3558655b72 Change submodule remote to use HTTPS link 2017-05-18 11:26:47 -07:00
Cameron Gutman
44cbf8adc1 Fix crash on stream disconnect on Android 7.0+ devices (root only) 2017-05-18 10:52:17 -07:00
Cameron Gutman
686490ba70 Handle decoder exceptions in dequeueInputBuffer 2017-05-18 10:25:48 -07:00
Cameron Gutman
d0ecde1e16 Fix crash if video decoder fails to initialize 2017-05-18 09:58:28 -07:00
Cameron Gutman
9417908848 Fix crash in virtual controller if a release event happens without a press 2017-05-17 21:32:24 -07:00
Cameron Gutman
93b0073467 Finish the activity if the computer wasn't found 2017-05-17 20:51:33 -07:00
Cameron Gutman
1434be262c Make sure a USB context exists before reporting input 2017-05-17 20:38:55 -07:00
Cameron Gutman
75aabd6471 Perform cleanup tasks in onDestroy() to avoid crashing if onStop() is called twice 2017-05-17 20:22:10 -07:00
Cameron Gutman
bafa2addd3 Fix crash queuing input buffer on stop 2017-05-17 20:09:11 -07:00
Cameron Gutman
32b787e77c Eat more decoder exceptions on stop/teardown 2017-05-17 19:45:55 -07:00
Cameron Gutman
43b58b7a5e Exclude Qualcomm's software HEVC decoder which chokes on our streams 2017-05-17 19:41:43 -07:00
Cameron Gutman
9ae1fe2696 Version 5.0.0 2017-05-15 23:49:01 -07:00
Cameron Gutman
6d0f34e2c4 Version 4.8.5 2017-05-15 23:30:30 -07:00
Cameron Gutman
f7d91b5107 Merge remote-tracking branch 'origin/master' into new-core 2017-05-15 23:23:45 -07:00
Cameron Gutman
a3c95480d8 Enable reference frame invalidation for recent Qualcomm and NVIDIA decoders 2017-05-15 23:23:17 -07:00
Cameron Gutman
732311c2a4 Fix codec display after streaming and restore polling behavior of non-direct submit decoders 2017-05-15 21:41:41 -07:00
joeyenfield
043c9a978e Fix issue with ipega controller not capturing keypresses on Samsung phones. (#386) 2017-05-15 18:07:54 -07:00
Cameron Gutman
36b248be4b Fix logging and deadlock on stream termination 2017-05-15 01:06:35 -07:00
Cameron Gutman
8e247ad9a6 Basic streaming working with new-core 2017-05-15 00:31:03 -07:00
Cameron Gutman
a2de98c91a JNI code complete 2017-05-14 23:08:21 -07:00
Cameron Gutman
81d1e615bf Adapt to new-core reworking of moonlight-common (likely buggy) 2017-05-14 17:14:45 -07:00
Cameron Gutman
244fae07ab Update gradle and build tools 2017-05-14 15:11:21 -07:00
Cameron Gutman
e7d96f0ac2 Explicitly set resizeableActivity=true so DeX will let us run in a resizeable window 2017-05-13 10:33:47 -07:00
Cameron Gutman
4555b3c74c Move JNI libraries over to moonlight-common/new-core 2017-05-12 18:57:26 -07:00
Cameron Gutman
8c13186757 Ignore iml files 2017-05-12 18:23:53 -07:00
Cameron Gutman
feafc4ef3c Get build working with AAR moonlight-common 2017-05-12 18:22:28 -07:00
Cameron Gutman
5c03295478 Add moonlight-common submodule 2017-05-12 17:48:33 -07:00
Cameron Gutman
dc3a923041 Bump version to 4.8.4 2017-05-11 23:20:43 -07:00
Cameron Gutman
eccba807bc Update gradle 2017-05-04 23:02:17 -07:00
Cameron Gutman
35fa8f5bcc Fix keyboard arrow keys being sent as gamepad d-pad events 2017-05-04 23:00:47 -07:00
laurentquark
0380910588 Add French language support (cleaned up by me) 2017-05-04 22:41:09 -07:00
Cameron Gutman
e85bb4372e Fix some build warnings and errors with the Dutch translation 2017-05-04 22:33:42 -07:00
Subject
2c345cd6c2 Update: Dutch Translation #1 (#261)
* Halfway through string translation

* Fixed up translation -- Ready for pull request

* Updated Translation to comply with Moonlight update with H265

* 4k & other languages option added, Matched with english strings.xml completely. Ready for pull
2017-05-04 22:30:53 -07:00
Cameron Gutman
b5c96cbb53 Fix manually switching language to Chinese 2017-05-04 22:24:18 -07:00
James Liu
b21ee5ca31 Add Chinese Translation (#345)
* Chinese Translation

I have made a Chinese translation which contains both Simplified and
Traditional, but I don't know what the heck is going on.
Now the Simplified one works perfectly, but the Traditional one does not work
at all; it just turns into English...

* Some Fixes
2017-05-04 22:04:52 -07:00
Cameron Gutman
9c7bff6c75 Merge branch 'master' of git://github.com/Nyaran/moonlight-android into Nyaran-master 2017-05-04 21:43:31 -07:00
Phonedolly
3d470d9aed add korean supports (#338)
* add korean supports

It might have some typos.

* translated one more sentence

* some fixes

"..." was replaced with the ellipsis character "…", and some strings were fixed

* few modifications
2017-05-04 21:35:06 -07:00
Cameron Gutman
b2a36c2c73 Use app context for getting WiFi service to address warnings in new build tools 2017-03-10 22:18:23 -08:00
Cameron Gutman
7978687bfc Update gradle and gradle wrapper 2017-03-10 22:08:09 -08:00
Cameron Gutman
f612ec80e2 Fix active gamepad mask when multi-controller is disabled 2017-02-06 19:26:05 -08:00
Cameron Gutman
7df1a39fcb Update common jar to allow the client to tell the host which controllers are attached 2017-02-04 21:02:11 -08:00
Cameron Gutman
a539ac62ec Version 4.8.3 2017-01-02 19:03:14 -08:00
Cameron Gutman
fa52e5edc2 Remove automatic disabling of back button due to false-positives 2017-01-02 19:02:30 -08:00
Cameron Gutman
3ca681f050 Set isGame to get lower video processing latency on some Android TVs 2017-01-02 18:52:20 -08:00
Cameron Gutman
8086c3d46b Bump version to 4.8.2 2016-12-13 21:28:45 -08:00
Cameron Gutman
928fca843f Update moonlight-common to support GFE 3.2 2016-12-13 21:27:28 -08:00
Cameron Gutman
25d74785d0 Update build tools to 25.0.2 2016-12-13 20:54:24 -08:00
Cameron Gutman
e12a8e7946 Update Gradle to 2.2.3 2016-12-13 20:51:39 -08:00
colin-foster-in-advantage
b14f2ce219 Fixed typo in NAL parser (#311)
Added a missing "()" in the NAL parser script
2016-12-06 09:36:17 -08:00
Cameron Gutman
d31be3d64e Prevent the help activity from reloading across config changes 2016-11-24 11:25:08 -08:00
Cameron Gutman
0704f2aaf6 Set noHistory for the Game activity 2016-11-24 11:23:18 -08:00
Cameron Gutman
832e52ac74 Reload PcView and AppView if the locale changes 2016-11-24 11:22:06 -08:00
Cameron Gutman
f5444551b2 Avoid looping when the thread is trying to be interrupted 2016-11-22 23:20:00 -08:00
Cameron Gutman
3143797b55 Fix transparent background when switching apps in multi-window 2016-11-22 23:18:55 -08:00
Cameron Gutman
cc9b1aeaab Use a MediaCodecInfo object to describe a codec rather than a codec name 2016-11-20 17:56:53 -08:00
Nyaran
f88c9904fb Added support for Spanish language 2016-08-14 11:08:30 +02:00
143 changed files with 3593 additions and 4819 deletions
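Two of the commits above ("Submit H.264 CSD in a single blob to try to prevent some decoder crashes" and "Improve HEVC decoder compatibility by submitting VPS+SPS+PPS in one CSD blob rather than individually") describe a MediaCodec compatibility technique. A minimal sketch of the idea, assuming the parameter sets are already available as Annex-B byte arrays; this illustrates the general approach only and is not the project's actual decoder code, which may instead queue the blob as a BUFFER_FLAG_CODEC_CONFIG input buffer:

```java
import java.nio.ByteBuffer;

import android.media.MediaCodec;
import android.media.MediaFormat;
import android.view.Surface;

class SingleBlobCsdSketch {
    // Hypothetical helper: configure an H.264 decoder with SPS and PPS
    // concatenated into one "csd-0" blob instead of splitting them across
    // csd-0/csd-1, which some decoders handle more reliably.
    static void configure(MediaCodec decoder, Surface surface,
                          byte[] sps, byte[] pps, int width, int height) {
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
        ByteBuffer csd = ByteBuffer.allocate(sps.length + pps.length);
        csd.put(sps);
        csd.put(pps);
        csd.flip();
        format.setByteBuffer("csd-0", csd); // both parameter sets in a single blob
        decoder.configure(format, surface, null, 0);
    }
}
```

For HEVC the same pattern applies, with the VPS, SPS, and PPS concatenated into csd-0.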
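The commit "Send different VK codes for left and right ctrl/alt/shift keys. Fixes #318" implies a keycode translation step. A rough sketch of that distinction, assuming the host expects Windows-style virtual-key codes (the hex values are the standard Win32 constants); this is illustrative only, not the project's KeyboardTranslator:

```java
import android.view.KeyEvent;

class ModifierKeySketch {
    // Illustrative only: map Android's left/right modifier keycodes to distinct
    // Windows-style virtual-key codes instead of a single generic modifier code.
    static int toVirtualKey(int androidKeyCode) {
        switch (androidKeyCode) {
            case KeyEvent.KEYCODE_SHIFT_LEFT:  return 0xA0; // VK_LSHIFT
            case KeyEvent.KEYCODE_SHIFT_RIGHT: return 0xA1; // VK_RSHIFT
            case KeyEvent.KEYCODE_CTRL_LEFT:   return 0xA2; // VK_LCONTROL
            case KeyEvent.KEYCODE_CTRL_RIGHT:  return 0xA3; // VK_RCONTROL
            case KeyEvent.KEYCODE_ALT_LEFT:    return 0xA4; // VK_LMENU
            case KeyEvent.KEYCODE_ALT_RIGHT:   return 0xA5; // VK_RMENU
            default:                           return 0;    // caller handles other keys
        }
    }
}
```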

.gitignore (1 changed line)

@@ -30,6 +30,7 @@ Thumbs.db
#.idea/workspace.xml - remove # and delete .idea if it better suit your needs.
.gradle
build/
app/app.iml
# Compiled JNI libraries folder
**/jniLibs

.gitmodules (6 changed lines)

@@ -1,3 +1,3 @@
[submodule "app/src/main/jni/jnienet/enet"]
path = app/src/main/jni/jnienet/enet
url = https://github.com/cgutman/enet.git
[submodule "moonlight-common"]
path = moonlight-common
url = https://github.com/moonlight-stream/moonlight-common.git

Wireshark Lua dissector for raw H.264 NAL parsing (filename not shown)

@@ -20,7 +20,7 @@ function p_h264raw.dissector(buf, pkt, root)
local i = 0
local data_start = -1
while i < buf:len do
while i < buf:len() do
-- Make sure we have a potential start sequence and type
if buf:len() - i < 5 then
-- We need more data

README.md

@@ -1,4 +1,4 @@
#Moonlight
# Moonlight
[Moonlight](http://moonlight-stream.com) is an open source implementation of NVIDIA's GameStream, as used by the NVIDIA Shield.
We reverse engineered the Shield streaming software and created a version that can be run on any Android device.
@@ -10,35 +10,35 @@ whether in your own home or over the internet.
Check our [wiki](https://github.com/moonlight-stream/moonlight-docs/wiki) for more detailed information or a troubleshooting guide.
##Features
## Features
* Streams any of your games from your PC to your Android device
* Full gamepad support for MOGA, Xbox 360, PS3, OUYA, and Shield
* Automatically finds GameStream-compatible PCs on your network
##Installation
## Installation
* Download and install Moonlight for Android from
[Google Play](https://play.google.com/store/apps/details?id=com.limelight), [Amazon App Store](http://www.amazon.com/gp/product/B00JK4MFN2), or directly from the [releases page](https://github.com/moonlight-stream/moonlight-android/releases)
* Download [GeForce Experience](http://www.geforce.com/geforce-experience) and install on your Windows PC
##Requirements
## Requirements
* [GameStream compatible](http://shield.nvidia.com/play-pc-games/) computer with GTX 600/700 series GPU
* [GameStream compatible](http://shield.nvidia.com/play-pc-games/) computer with an NVIDIA GeForce GTX 600 series or higher desktop or mobile GPU (GT-series and AMD GPUs not supported)
* Android device running 4.1 (Jelly Bean) or higher
* High-end wireless router (802.11n dual-band recommended)
##Usage
## Usage
* Turn on GameStream in the GFE settings
* If you are connecting from outside the same network, turn on internet
streaming
* When on the same network as your PC, open Moonlight and tap on your PC in the list
* Accept the pairing confirmation on your PC
* Accept the pairing confirmation on your PC and add the PIN if needed
* Tap your PC again to view the list of apps to stream
* Play games!
##Contribute
## Contribute
This project is being actively developed at [XDA Developers](http://forum.xda-developers.com/showthread.php?t=2505510)
@@ -46,13 +46,13 @@ This project is being actively developed at [XDA Developers](http://forum.xda-de
2. Write code
3. Send Pull Requests
##Building
## Building
* Install Android Studio and the Android NDK
* Run git submodule update --init --recursive from within moonlight-android/
* In moonlight-android/, create a file called local.properties. Add an ndk.dir= property to the local.properties file and set it equal to your NDK directory.
* Build the APK using Android Studio
##Authors
## Authors
* [Cameron Gutman](https://github.com/cgutman)
* [Diego Waxemberg](https://github.com/dwaxemberg)

app/app.iml (deleted)

@@ -1,153 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<module external.linked.project.id=":app" external.linked.project.path="$MODULE_DIR$" external.root.project.path="$MODULE_DIR$/.." external.system.id="GRADLE" external.system.module.group="moonlight-android" external.system.module.version="unspecified" type="JAVA_MODULE" version="4">
<component name="FacetManager">
<facet type="android-gradle" name="Android-Gradle">
<configuration>
<option name="GRADLE_PROJECT_PATH" value=":app" />
</configuration>
</facet>
<facet type="android" name="Android">
<configuration>
<option name="SELECTED_BUILD_VARIANT" value="nonRootDebug" />
<option name="SELECTED_TEST_ARTIFACT" value="_android_test_" />
<option name="ASSEMBLE_TASK_NAME" value="assembleNonRootDebug" />
<option name="COMPILE_JAVA_TASK_NAME" value="compileNonRootDebugSources" />
<afterSyncTasks>
<task>generateNonRootDebugSources</task>
</afterSyncTasks>
<option name="ALLOW_USER_CONFIGURATION" value="false" />
<option name="MANIFEST_FILE_RELATIVE_PATH" value="/src/main/AndroidManifest.xml" />
<option name="RES_FOLDER_RELATIVE_PATH" value="/src/main/res" />
<option name="RES_FOLDERS_RELATIVE_PATH" value="file://$MODULE_DIR$/src/main/res" />
<option name="ASSETS_FOLDER_RELATIVE_PATH" value="/src/main/assets" />
</configuration>
</facet>
<facet type="native-android-gradle" name="Native-Android-Gradle">
<configuration>
<option name="SELECTED_BUILD_VARIANT" value="nonRootDebug" />
</configuration>
</facet>
</component>
<component name="NewModuleRootManager" LANGUAGE_LEVEL="JDK_1_7" inherit-compiler-output="false">
<output url="file://$MODULE_DIR$/build/intermediates/classes/nonRoot/debug" />
<output-test url="file://$MODULE_DIR$/build/intermediates/classes/test/nonRoot/debug" />
<exclude-output />
<content url="file://$MODULE_DIR$">
<sourceFolder url="file://$MODULE_DIR$/src/main/jni/jnienet/enet" isTestSource="false" />
<sourceFolder url="file://$MODULE_DIR$/src/main/jni/jnienet" isTestSource="false" />
<sourceFolder url="file://$MODULE_DIR$/src/main/jni/evdev_reader" isTestSource="false" />
<sourceFolder url="file://$MODULE_DIR$/src/main/jni/nv_opus_dec" isTestSource="false" />
<sourceFolder url="file://$MODULE_DIR$/build/generated/source/r/nonRoot/debug" isTestSource="false" generated="true" />
<sourceFolder url="file://$MODULE_DIR$/build/generated/source/aidl/nonRoot/debug" isTestSource="false" generated="true" />
<sourceFolder url="file://$MODULE_DIR$/build/generated/source/buildConfig/nonRoot/debug" isTestSource="false" generated="true" />
<sourceFolder url="file://$MODULE_DIR$/build/generated/source/rs/nonRoot/debug" isTestSource="false" generated="true" />
<sourceFolder url="file://$MODULE_DIR$/build/generated/source/apt/nonRoot/debug" isTestSource="false" generated="true" />
<sourceFolder url="file://$MODULE_DIR$/build/generated/res/rs/nonRoot/debug" type="java-resource" />
<sourceFolder url="file://$MODULE_DIR$/build/generated/res/resValues/nonRoot/debug" type="java-resource" />
<sourceFolder url="file://$MODULE_DIR$/src/nonRootDebug/res" type="java-resource" />
<sourceFolder url="file://$MODULE_DIR$/src/nonRootDebug/resources" type="java-resource" />
<sourceFolder url="file://$MODULE_DIR$/src/nonRootDebug/assets" type="java-resource" />
<sourceFolder url="file://$MODULE_DIR$/src/nonRootDebug/aidl" isTestSource="false" />
<sourceFolder url="file://$MODULE_DIR$/src/nonRootDebug/java" isTestSource="false" />
<sourceFolder url="file://$MODULE_DIR$/src/nonRootDebug/rs" isTestSource="false" />
<sourceFolder url="file://$MODULE_DIR$/src/nonRootDebug/shaders" isTestSource="false" />
<sourceFolder url="file://$MODULE_DIR$/build/generated/source/r/androidTest/nonRoot/debug" isTestSource="true" generated="true" />
<sourceFolder url="file://$MODULE_DIR$/build/generated/source/aidl/androidTest/nonRoot/debug" isTestSource="true" generated="true" />
<sourceFolder url="file://$MODULE_DIR$/build/generated/source/buildConfig/androidTest/nonRoot/debug" isTestSource="true" generated="true" />
<sourceFolder url="file://$MODULE_DIR$/build/generated/source/rs/androidTest/nonRoot/debug" isTestSource="true" generated="true" />
<sourceFolder url="file://$MODULE_DIR$/build/generated/source/apt/androidTest/nonRoot/debug" isTestSource="true" generated="true" />
<sourceFolder url="file://$MODULE_DIR$/build/generated/res/rs/androidTest/nonRoot/debug" type="java-test-resource" />
<sourceFolder url="file://$MODULE_DIR$/build/generated/res/resValues/androidTest/nonRoot/debug" type="java-test-resource" />
<sourceFolder url="file://$MODULE_DIR$/src/testNonRootDebug/res" type="java-test-resource" />
<sourceFolder url="file://$MODULE_DIR$/src/testNonRootDebug/resources" type="java-test-resource" />
<sourceFolder url="file://$MODULE_DIR$/src/testNonRootDebug/assets" type="java-test-resource" />
<sourceFolder url="file://$MODULE_DIR$/src/testNonRootDebug/aidl" isTestSource="true" />
<sourceFolder url="file://$MODULE_DIR$/src/testNonRootDebug/java" isTestSource="true" />
<sourceFolder url="file://$MODULE_DIR$/src/testNonRootDebug/rs" isTestSource="true" />
<sourceFolder url="file://$MODULE_DIR$/src/testNonRootDebug/shaders" isTestSource="true" />
<sourceFolder url="file://$MODULE_DIR$/src/nonRoot/res" type="java-resource" />
<sourceFolder url="file://$MODULE_DIR$/src/nonRoot/resources" type="java-resource" />
<sourceFolder url="file://$MODULE_DIR$/src/nonRoot/assets" type="java-resource" />
<sourceFolder url="file://$MODULE_DIR$/src/nonRoot/aidl" isTestSource="false" />
<sourceFolder url="file://$MODULE_DIR$/src/nonRoot/java" isTestSource="false" />
<sourceFolder url="file://$MODULE_DIR$/src/nonRoot/rs" isTestSource="false" />
<sourceFolder url="file://$MODULE_DIR$/src/nonRoot/shaders" isTestSource="false" />
<sourceFolder url="file://$MODULE_DIR$/src/testNonRoot/res" type="java-test-resource" />
<sourceFolder url="file://$MODULE_DIR$/src/testNonRoot/resources" type="java-test-resource" />
<sourceFolder url="file://$MODULE_DIR$/src/testNonRoot/assets" type="java-test-resource" />
<sourceFolder url="file://$MODULE_DIR$/src/testNonRoot/aidl" isTestSource="true" />
<sourceFolder url="file://$MODULE_DIR$/src/testNonRoot/java" isTestSource="true" />
<sourceFolder url="file://$MODULE_DIR$/src/testNonRoot/rs" isTestSource="true" />
<sourceFolder url="file://$MODULE_DIR$/src/testNonRoot/shaders" isTestSource="true" />
<sourceFolder url="file://$MODULE_DIR$/src/androidTestNonRoot/res" type="java-test-resource" />
<sourceFolder url="file://$MODULE_DIR$/src/androidTestNonRoot/resources" type="java-test-resource" />
<sourceFolder url="file://$MODULE_DIR$/src/androidTestNonRoot/assets" type="java-test-resource" />
<sourceFolder url="file://$MODULE_DIR$/src/androidTestNonRoot/aidl" isTestSource="true" />
<sourceFolder url="file://$MODULE_DIR$/src/androidTestNonRoot/java" isTestSource="true" />
<sourceFolder url="file://$MODULE_DIR$/src/androidTestNonRoot/rs" isTestSource="true" />
<sourceFolder url="file://$MODULE_DIR$/src/androidTestNonRoot/shaders" isTestSource="true" />
<sourceFolder url="file://$MODULE_DIR$/src/debug/res" type="java-resource" />
<sourceFolder url="file://$MODULE_DIR$/src/debug/resources" type="java-resource" />
<sourceFolder url="file://$MODULE_DIR$/src/debug/assets" type="java-resource" />
<sourceFolder url="file://$MODULE_DIR$/src/debug/aidl" isTestSource="false" />
<sourceFolder url="file://$MODULE_DIR$/src/debug/java" isTestSource="false" />
<sourceFolder url="file://$MODULE_DIR$/src/debug/rs" isTestSource="false" />
<sourceFolder url="file://$MODULE_DIR$/src/debug/shaders" isTestSource="false" />
<sourceFolder url="file://$MODULE_DIR$/src/testDebug/res" type="java-test-resource" />
<sourceFolder url="file://$MODULE_DIR$/src/testDebug/resources" type="java-test-resource" />
<sourceFolder url="file://$MODULE_DIR$/src/testDebug/assets" type="java-test-resource" />
<sourceFolder url="file://$MODULE_DIR$/src/testDebug/aidl" isTestSource="true" />
<sourceFolder url="file://$MODULE_DIR$/src/testDebug/java" isTestSource="true" />
<sourceFolder url="file://$MODULE_DIR$/src/testDebug/rs" isTestSource="true" />
<sourceFolder url="file://$MODULE_DIR$/src/testDebug/shaders" isTestSource="true" />
<sourceFolder url="file://$MODULE_DIR$/src/main/res" type="java-resource" />
<sourceFolder url="file://$MODULE_DIR$/src/main/resources" type="java-resource" />
<sourceFolder url="file://$MODULE_DIR$/src/main/assets" type="java-resource" />
<sourceFolder url="file://$MODULE_DIR$/src/main/aidl" isTestSource="false" />
<sourceFolder url="file://$MODULE_DIR$/src/main/java" isTestSource="false" />
<sourceFolder url="file://$MODULE_DIR$/src/main/rs" isTestSource="false" />
<sourceFolder url="file://$MODULE_DIR$/src/main/shaders" isTestSource="false" />
<sourceFolder url="file://$MODULE_DIR$/src/androidTest/res" type="java-test-resource" />
<sourceFolder url="file://$MODULE_DIR$/src/androidTest/resources" type="java-test-resource" />
<sourceFolder url="file://$MODULE_DIR$/src/androidTest/assets" type="java-test-resource" />
<sourceFolder url="file://$MODULE_DIR$/src/androidTest/aidl" isTestSource="true" />
<sourceFolder url="file://$MODULE_DIR$/src/androidTest/java" isTestSource="true" />
<sourceFolder url="file://$MODULE_DIR$/src/androidTest/rs" isTestSource="true" />
<sourceFolder url="file://$MODULE_DIR$/src/androidTest/shaders" isTestSource="true" />
<sourceFolder url="file://$MODULE_DIR$/src/test/res" type="java-test-resource" />
<sourceFolder url="file://$MODULE_DIR$/src/test/resources" type="java-test-resource" />
<sourceFolder url="file://$MODULE_DIR$/src/test/assets" type="java-test-resource" />
<sourceFolder url="file://$MODULE_DIR$/src/test/aidl" isTestSource="true" />
<sourceFolder url="file://$MODULE_DIR$/src/test/java" isTestSource="true" />
<sourceFolder url="file://$MODULE_DIR$/src/test/rs" isTestSource="true" />
<sourceFolder url="file://$MODULE_DIR$/src/test/shaders" isTestSource="true" />
<excludeFolder url="file://$MODULE_DIR$/build/intermediates/assets" />
<excludeFolder url="file://$MODULE_DIR$/build/intermediates/blame" />
<excludeFolder url="file://$MODULE_DIR$/build/intermediates/classes" />
<excludeFolder url="file://$MODULE_DIR$/build/intermediates/dependency-cache" />
<excludeFolder url="file://$MODULE_DIR$/build/intermediates/incremental" />
<excludeFolder url="file://$MODULE_DIR$/build/intermediates/incremental-safeguard" />
<excludeFolder url="file://$MODULE_DIR$/build/intermediates/jniLibs" />
<excludeFolder url="file://$MODULE_DIR$/build/intermediates/manifests" />
<excludeFolder url="file://$MODULE_DIR$/build/intermediates/ndkBuild" />
<excludeFolder url="file://$MODULE_DIR$/build/intermediates/pre-dexed" />
<excludeFolder url="file://$MODULE_DIR$/build/intermediates/res" />
<excludeFolder url="file://$MODULE_DIR$/build/intermediates/rs" />
<excludeFolder url="file://$MODULE_DIR$/build/intermediates/shaders" />
<excludeFolder url="file://$MODULE_DIR$/build/intermediates/symbols" />
<excludeFolder url="file://$MODULE_DIR$/build/intermediates/transforms" />
<excludeFolder url="file://$MODULE_DIR$/build/outputs" />
<excludeFolder url="file://$MODULE_DIR$/build/tmp" />
</content>
<orderEntry type="jdk" jdkName="Android API 25 Platform" jdkType="Android SDK" />
<orderEntry type="sourceFolder" forTests="false" />
<orderEntry type="library" exported="" name="bcprov-jdk15on-1.52" level="project" />
<orderEntry type="library" exported="" name="bcpkix-jdk15on-1.52" level="project" />
<orderEntry type="library" exported="" name="tinyrtsp" level="project" />
<orderEntry type="library" exported="" name="limelight-common" level="project" />
<orderEntry type="library" exported="" name="jmdns-3.4.2" level="project" />
<orderEntry type="library" exported="" name="okhttp-2.4.0" level="project" />
<orderEntry type="library" exported="" name="jcodec-0.1.9-patched" level="project" />
<orderEntry type="library" exported="" name="okio-1.5.0" level="project" />
</component>
</module>

app/build.gradle

@@ -4,30 +4,44 @@ import org.apache.tools.ant.taskdefs.condition.Os
apply plugin: 'com.android.application'
android {
compileSdkVersion 25
buildToolsVersion "25.0.0"
compileSdkVersion 27
buildToolsVersion '27.0.1'
defaultConfig {
minSdkVersion 16
targetSdkVersion 25
targetSdkVersion 27
versionName "4.8.1"
versionCode = 113
versionName "5.6.1"
versionCode = 139
}
flavorDimensions "root"
productFlavors {
root {
applicationId "com.limelight.root"
ndk {
abiFilters "armeabi-v7a", "arm64-v8a", "x86", "x86_64", "mips", "mips64"
// Android O has native mouse capture, so don't show the rooted
// version to devices running O on the Play Store.
maxSdkVersion 25
externalNativeBuild {
ndkBuild {
arguments "PRODUCT_FLAVOR=root"
}
}
applicationId "com.limelight.root"
dimension "root"
}
nonRoot {
applicationId "com.limelight"
ndk {
abiFilters "armeabi-v7a", "arm64-v8a", "x86", "x86_64", "mips", "mips64"
externalNativeBuild {
ndkBuild {
arguments "PRODUCT_FLAVOR=nonRoot"
}
}
applicationId "com.limelight"
dimension "root"
}
}
@@ -36,19 +50,15 @@ android {
}
buildTypes {
debug {
applicationIdSuffix ".debug"
}
release {
minifyEnabled false
proguardFiles getDefaultProguardFile('proguard-android.txt'), 'proguard-rules.txt'
proguardFiles getDefaultProguardFile('proguard-android.txt')
}
}
// These lines are required to avoid dexing issues with the BouncyCastle library
// bundled with limelight-common.jar
packagingOptions {
exclude 'META-INF/BCKEY.SF'
exclude 'META-INF/BCKEY.DSA'
}
externalNativeBuild {
ndkBuild {
path "src/main/jni/Android.mk"
@@ -57,14 +67,9 @@ android {
}
dependencies {
compile group: 'org.bouncycastle', name: 'bcprov-jdk15on', version: '1.52'
compile group: 'org.bouncycastle', name: 'bcpkix-jdk15on', version: '1.52'
implementation 'org.bouncycastle:bcprov-jdk15on:1.57'
implementation 'org.bouncycastle:bcpkix-jdk15on:1.57'
implementation files('libs/jcodec-0.1.9-patched.jar')
compile group: 'com.squareup.okhttp', name: 'okhttp', version:'2.4.0'
compile group: 'com.squareup.okio', name:'okio', version:'1.5.0'
compile files('libs/jmdns-3.4.2.jar')
compile files('libs/limelight-common.jar')
compile files('libs/tinyrtsp.jar')
compile files('libs/jcodec-0.1.9-patched.jar')
implementation project(':moonlight-common')
}

3 binary files changed (contents not shown)

AndroidManifest.xml

@@ -24,10 +24,19 @@
android:name="android.software.leanback"
android:required="false" />
<!-- Disable legacy input emulation on ChromeOS -->
<uses-feature
android:name="android.hardware.type.pc"
android:required="false"/>
<application
android:allowBackup="true"
android:fullBackupContent="@xml/backup_rules"
android:isGame="true"
android:banner="@drawable/atv_banner"
android:icon="@drawable/ic_launcher"
android:appCategory="game"
android:icon="@mipmap/ic_launcher"
android:roundIcon="@mipmap/ic_launcher"
android:theme="@style/AppTheme">
<!-- Samsung multi-window support -->
@@ -39,9 +48,13 @@
android:name="com.sec.android.support.multiwindow"
android:value="true" />
<!-- Samsung DeX support requires explicit placement of android:resizeableActivity="true"
in each activity even though it is implied by targeting API 24+ -->
<activity
android:name=".PcView"
android:configChanges="mcc|mnc|locale|touchscreen|keyboard|keyboardHidden|navigation|screenLayout|fontScale|uiMode|orientation|screenSize|smallestScreenSize|layoutDirection">
android:resizeableActivity="true"
android:configChanges="mcc|mnc|touchscreen|keyboard|keyboardHidden|navigation|screenLayout|fontScale|uiMode|orientation|screenSize|smallestScreenSize|layoutDirection">
<intent-filter>
<action android:name="android.intent.action.MAIN" />
@@ -55,6 +68,7 @@
<activity
android:name=".AppViewShortcutTrampoline"
android:noHistory="true"
android:resizeableActivity="true"
android:configChanges="mcc|mnc|locale|touchscreen|keyboard|keyboardHidden|navigation|screenLayout|fontScale|uiMode|orientation|screenSize|smallestScreenSize|layoutDirection">
<meta-data
android:name="android.support.PARENT_ACTIVITY"
@@ -62,13 +76,15 @@
</activity>
<activity
android:name=".AppView"
android:configChanges="mcc|mnc|locale|touchscreen|keyboard|keyboardHidden|navigation|screenLayout|fontScale|uiMode|orientation|screenSize|smallestScreenSize|layoutDirection">
android:resizeableActivity="true"
android:configChanges="mcc|mnc|touchscreen|keyboard|keyboardHidden|navigation|screenLayout|fontScale|uiMode|orientation|screenSize|smallestScreenSize|layoutDirection">
<meta-data
android:name="android.support.PARENT_ACTIVITY"
android:value="com.limelight.PcView" />
</activity>
<activity
android:name=".preferences.StreamSettings"
android:resizeableActivity="true"
android:label="Streaming Settings">
<meta-data
android:name="android.support.PARENT_ACTIVITY"
@@ -76,6 +92,7 @@
</activity>
<activity
android:name=".preferences.AddComputerManually"
android:resizeableActivity="true"
android:label="Add Computer Manually">
<meta-data
android:name="android.support.PARENT_ACTIVITY"
@@ -85,6 +102,11 @@
android:name=".Game"
android:configChanges="mcc|mnc|locale|touchscreen|keyboard|keyboardHidden|navigation|screenLayout|fontScale|uiMode|orientation|screenSize|smallestScreenSize|layoutDirection"
android:screenOrientation="sensorLandscape"
android:noHistory="true"
android:supportsPictureInPicture="true"
android:resizeableActivity="true"
android:launchMode="singleTask"
android:excludeFromRecents="true"
android:theme="@style/StreamTheme">
<meta-data
android:name="android.support.PARENT_ACTIVITY"
@@ -101,7 +123,14 @@
android:name=".binding.input.driver.UsbDriverService"
android:label="Usb Driver Service" />
<activity android:name=".HelpActivity"></activity>
<activity
android:name=".HelpActivity"
android:resizeableActivity="true"
android:configChanges="mcc|mnc|touchscreen|keyboard|keyboardHidden|navigation|screenLayout|fontScale|uiMode|orientation|screenSize|smallestScreenSize|layoutDirection">
<meta-data
android:name="android.support.PARENT_ACTIVITY"
android:value="com.limelight.PcView" />
</activity>
</application>
</manifest>

Binary files changed (including one image, 58 KiB; contents not shown)

AppView.java

@@ -2,7 +2,6 @@ package com.limelight;
import java.io.StringReader;
import java.util.List;
import java.util.Locale;
import java.util.UUID;
import com.limelight.computers.ComputerManagerListener;
@@ -27,7 +26,6 @@ import android.app.Service;
import android.content.ComponentName;
import android.content.Intent;
import android.content.ServiceConnection;
import android.content.res.Configuration;
import android.os.Bundle;
import android.os.IBinder;
import android.view.ContextMenu;
@@ -81,6 +79,10 @@ public class AppView extends Activity implements AdapterFragmentCallbacks {
// Get the computer object
computer = managerBinder.getComputer(UUID.fromString(uuidString));
if (computer == null) {
finish();
return;
}
try {
appGridAdapter = new AppGridAdapter(AppView.this,
@@ -233,14 +235,13 @@ public class AppView extends Activity implements AdapterFragmentCallbacks {
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
// Assume we're in the foreground when created to avoid a race
// between binding to CMS and onResume()
inForeground = true;
shortcutHelper = new ShortcutHelper(this);
String locale = PreferenceConfiguration.readPreferences(this).language;
if (!locale.equals(PreferenceConfiguration.DEFAULT_LANGUAGE)) {
Configuration config = new Configuration(getResources().getConfiguration());
config.locale = new Locale(locale);
getResources().updateConfiguration(config, getResources().getDisplayMetrics());
}
UiHelper.setLocale(this);
setContentView(R.layout.activity_app_view);
@@ -251,7 +252,7 @@ public class AppView extends Activity implements AdapterFragmentCallbacks {
String computerName = getIntent().getStringExtra(NAME_EXTRA);
String labelText = getResources().getString(R.string.title_applist)+" "+computerName;
TextView label = (TextView) findViewById(R.id.appListText);
TextView label = findViewById(R.id.appListText);
setTitle(labelText);
label.setText(labelText);
@@ -303,6 +304,9 @@ public class AppView extends Activity implements AdapterFragmentCallbacks {
protected void onResume() {
super.onResume();
// Display a decoder crash notification if we've returned after a crash
UiHelper.showDecoderCrashDialog(this);
inForeground = true;
startComputerUpdates();
}
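The onCreate() hunk above (and the matching hunk in Game.java further down) replaces the inline locale override with a call to UiHelper.setLocale(this). A minimal sketch of what such a helper could look like, assuming it simply centralizes the Configuration/Locale logic the diff removes; the real UiHelper implementation is not part of this compare view:

```java
import java.util.Locale;

import android.app.Activity;
import android.content.res.Configuration;

import com.limelight.preferences.PreferenceConfiguration;

class UiHelperSketch {
    // Hypothetical stand-in for UiHelper.setLocale(): assumes it re-applies the
    // user's language preference exactly like the code removed from onCreate().
    static void setLocale(Activity activity) {
        String locale = PreferenceConfiguration.readPreferences(activity).language;
        if (!locale.equals(PreferenceConfiguration.DEFAULT_LANGUAGE)) {
            Configuration config = new Configuration(activity.getResources().getConfiguration());
            config.locale = new Locale(locale);
            activity.getResources().updateConfiguration(config, activity.getResources().getDisplayMetrics());
        }
    }
}
```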

AppViewShortcutTrampoline.java

@@ -91,7 +91,7 @@ public class AppViewShortcutTrampoline extends Activity {
// If a game is running, we'll make the stream the top level activity
if (details.runningGameId != 0) {
intentStack.add(ServerHelper.createStartIntent(AppViewShortcutTrampoline.this,
new NvApp("app", details.runningGameId), details, managerBinder));
new NvApp("app", details.runningGameId, false), details, managerBinder));
}
// Now start the activities

Game.java

@@ -10,32 +10,37 @@ import com.limelight.binding.input.TouchContext;
import com.limelight.binding.input.driver.UsbDriverService;
import com.limelight.binding.input.evdev.EvdevListener;
import com.limelight.binding.input.virtual_controller.VirtualController;
import com.limelight.binding.video.EnhancedDecoderRenderer;
import com.limelight.binding.video.CrashListener;
import com.limelight.binding.video.MediaCodecDecoderRenderer;
import com.limelight.binding.video.MediaCodecHelper;
import com.limelight.nvstream.NvConnection;
import com.limelight.nvstream.NvConnectionListener;
import com.limelight.nvstream.StreamConfiguration;
import com.limelight.nvstream.av.video.VideoDecoderRenderer;
import com.limelight.nvstream.http.NvApp;
import com.limelight.nvstream.input.KeyboardPacket;
import com.limelight.nvstream.input.MouseButtonPacket;
import com.limelight.nvstream.jni.MoonBridge;
import com.limelight.preferences.GlPreferences;
import com.limelight.preferences.PreferenceConfiguration;
import com.limelight.ui.GameGestures;
import com.limelight.ui.StreamView;
import com.limelight.utils.Dialog;
import com.limelight.utils.ShortcutHelper;
import com.limelight.utils.SpinnerDialog;
import com.limelight.utils.UiHelper;
import android.annotation.SuppressLint;
import android.annotation.TargetApi;
import android.app.Activity;
import android.app.PictureInPictureParams;
import android.app.Service;
import android.content.ComponentName;
import android.content.Context;
import android.content.Intent;
import android.content.ServiceConnection;
import android.content.res.Configuration;
import android.content.SharedPreferences;
import android.graphics.Point;
import android.graphics.Rect;
import android.hardware.input.InputManager;
import android.media.AudioManager;
import android.net.ConnectivityManager;
@@ -45,6 +50,7 @@ import android.os.Bundle;
import android.os.Handler;
import android.os.IBinder;
import android.os.SystemClock;
import android.util.Rational;
import android.view.Display;
import android.view.InputDevice;
import android.view.KeyEvent;
@@ -60,8 +66,6 @@ import android.widget.FrameLayout;
import android.view.inputmethod.InputMethodManager;
import android.widget.Toast;
import java.util.Locale;
public class Game extends Activity implements SurfaceHolder.Callback,
OnGenericMotionListener, OnTouchListener, NvConnectionListener, EvdevListener,
@@ -82,9 +86,9 @@ public class Game extends Activity implements SurfaceHolder.Callback,
private ControllerHandler controllerHandler;
private VirtualController virtualController;
private KeyboardTranslator keybTranslator;
private PreferenceConfiguration prefConfig;
private SharedPreferences tombstonePrefs;
private NvConnection conn;
private SpinnerDialog spinner;
@@ -100,12 +104,11 @@ public class Game extends Activity implements SurfaceHolder.Callback,
private ShortcutHelper shortcutHelper;
private EnhancedDecoderRenderer decoderRenderer;
private MediaCodecDecoderRenderer decoderRenderer;
private boolean reportedCrash;
private WifiManager.WifiLock wifiLock;
private int drFlags = 0;
private boolean connectedToUsbDriverService = false;
private ServiceConnection usbDriverServiceConnection = new ServiceConnection() {
@Override
@@ -128,19 +131,13 @@ public class Game extends Activity implements SurfaceHolder.Callback,
public static final String EXTRA_STREAMING_REMOTE = "Remote";
public static final String EXTRA_PC_UUID = "UUID";
public static final String EXTRA_PC_NAME = "PcName";
public static final String EXTRA_APP_HDR = "HDR";
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
shortcutHelper = new ShortcutHelper(this);
String locale = PreferenceConfiguration.readPreferences(this).language;
if (!locale.equals(PreferenceConfiguration.DEFAULT_LANGUAGE)) {
Configuration config = new Configuration(getResources().getConfiguration());
config.locale = new Locale(locale);
getResources().updateConfiguration(config, getResources().getDisplayMetrics());
}
UiHelper.setLocale(this);
// We don't want a title bar
requestWindowFeature(Window.FEATURE_NO_TITLE);
@@ -176,21 +173,36 @@ public class Game extends Activity implements SurfaceHolder.Callback,
// Read the stream preferences
prefConfig = PreferenceConfiguration.readPreferences(this);
tombstonePrefs = Game.this.getSharedPreferences("DecoderTombstone", 0);
if (prefConfig.stretchVideo) {
drFlags |= VideoDecoderRenderer.FLAG_FILL_SCREEN;
}
// Listen for events on the game surface
streamView = (StreamView) findViewById(R.id.surfaceView);
streamView = findViewById(R.id.surfaceView);
streamView.setOnGenericMotionListener(this);
streamView.setOnTouchListener(this);
inputCaptureProvider = InputCaptureManager.getInputCaptureProvider(this, this);
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
// The view must be focusable for pointer capture to work.
streamView.setFocusable(true);
streamView.setDefaultFocusHighlightEnabled(false);
streamView.setOnCapturedPointerListener(new View.OnCapturedPointerListener() {
@Override
public boolean onCapturedPointer(View view, MotionEvent motionEvent) {
return handleMotionEvent(motionEvent);
}
});
}
// Warn the user if they're on a metered connection
checkDataConnection();
ConnectivityManager connMgr = (ConnectivityManager) getSystemService(Context.CONNECTIVITY_SERVICE);
if (connMgr.isActiveNetworkMetered()) {
displayTransientMessage(getResources().getString(R.string.conn_metered));
}
// Make sure Wi-Fi is fully powered up
WifiManager wifiMgr = (WifiManager) getSystemService(Context.WIFI_SERVICE);
WifiManager wifiMgr = (WifiManager) getApplicationContext().getSystemService(Context.WIFI_SERVICE);
wifiLock = wifiMgr.createWifiLock(WifiManager.WIFI_MODE_FULL_HIGH_PERF, "Limelight");
wifiLock.setReferenceCounted(false);
wifiLock.acquire();
@@ -202,6 +214,7 @@ public class Game extends Activity implements SurfaceHolder.Callback,
boolean remote = Game.this.getIntent().getBooleanExtra(EXTRA_STREAMING_REMOTE, false);
String uuid = Game.this.getIntent().getStringExtra(EXTRA_PC_UUID);
String pcName = Game.this.getIntent().getStringExtra(EXTRA_PC_NAME);
boolean willStreamHdr = Game.this.getIntent().getBooleanExtra(EXTRA_APP_HDR, false);
if (appId == StreamConfiguration.INVALID_APP_ID) {
finish();
@@ -209,51 +222,98 @@ public class Game extends Activity implements SurfaceHolder.Callback,
}
// Add a launcher shortcut for this PC (forced, since this is user interaction)
shortcutHelper = new ShortcutHelper(this);
shortcutHelper.createAppViewShortcut(uuid, pcName, uuid, true);
shortcutHelper.reportShortcutUsed(uuid);
// Initialize the MediaCodec helper before creating the decoder
MediaCodecHelper.initializeWithContext(this);
GlPreferences glPrefs = GlPreferences.readPreferences(this);
MediaCodecHelper.initialize(this, glPrefs.glRenderer);
decoderRenderer = new MediaCodecDecoderRenderer(prefConfig.videoFormat);
// Check if the user has enabled HDR
if (prefConfig.enableHdr) {
// Check if the app supports it
if (!willStreamHdr) {
Toast.makeText(this, "This game does not support HDR10", Toast.LENGTH_SHORT).show();
}
// It does, so start our HDR checklist
else if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.N) {
// We already know the app supports HDR if willStreamHdr is set.
Display display = getWindowManager().getDefaultDisplay();
Display.HdrCapabilities hdrCaps = display.getHdrCapabilities();
// We must now ensure our display is compatible with HDR10
boolean foundHdr10 = false;
for (int hdrType : hdrCaps.getSupportedHdrTypes()) {
if (hdrType == Display.HdrCapabilities.HDR_TYPE_HDR10) {
LimeLog.info("Display supports HDR10");
foundHdr10 = true;
}
}
if (!foundHdr10) {
// Nope, no HDR for us :(
willStreamHdr = false;
Toast.makeText(this, "Display does not support HDR10", Toast.LENGTH_LONG).show();
}
}
else {
Toast.makeText(this, "HDR requires Android 7.0 or later", Toast.LENGTH_LONG).show();
willStreamHdr = false;
}
}
else {
willStreamHdr = false;
}
decoderRenderer = new MediaCodecDecoderRenderer(prefConfig,
new CrashListener() {
@Override
public void notifyCrash(Exception e) {
// The MediaCodec instance is going down due to a crash
// let's tell the user something when they open the app again
// We must use commit because the app will crash when we return from this function
tombstonePrefs.edit().putInt("CrashCount", tombstonePrefs.getInt("CrashCount", 0) + 1).commit();
reportedCrash = true;
}
},
tombstonePrefs.getInt("CrashCount", 0),
connMgr.isActiveNetworkMetered(),
willStreamHdr,
glPrefs.glRenderer
);
// Don't stream HDR if the decoder can't support it
if (willStreamHdr && !decoderRenderer.isHevcMain10Hdr10Supported()) {
willStreamHdr = false;
Toast.makeText(this, "Decoder does not support HEVC Main10HDR10", Toast.LENGTH_LONG).show();
}
// Display a message to the user if H.265 was forced on but we still didn't find a decoder
if (prefConfig.videoFormat == PreferenceConfiguration.FORCE_H265_ON && !decoderRenderer.isHevcSupported()) {
Toast.makeText(this, "No H.265 decoder found.\nFalling back to H.264.", Toast.LENGTH_LONG).show();
}
if (!decoderRenderer.isAvcSupported()) {
if (spinner != null) {
spinner.dismiss();
spinner = null;
}
// If we can't find an AVC decoder, we can't proceed
Dialog.displayDialog(this, getResources().getString(R.string.conn_error_title),
"This device or ROM doesn't support hardware accelerated H.264 playback.", true);
return;
}
StreamConfiguration config = new StreamConfiguration.Builder()
.setResolution(prefConfig.width, prefConfig.height)
.setRefreshRate(prefConfig.fps)
.setApp(new NvApp(appName, appId))
.setApp(new NvApp(appName, appId, willStreamHdr))
.setBitrate(prefConfig.bitrate * 1000)
.setEnableSops(prefConfig.enableSops)
.enableAdaptiveResolution((decoderRenderer.getCapabilities() &
VideoDecoderRenderer.CAPABILITY_ADAPTIVE_RESOLUTION) != 0)
.enableLocalAudioPlayback(prefConfig.playHostAudio)
.setMaxPacketSize(remote ? 1024 : 1292)
.setRemote(remote)
.setHevcBitratePercentageMultiplier(75)
.setHevcSupported(decoderRenderer.isHevcSupported())
.setEnableHdr(willStreamHdr)
.setAudioConfiguration(prefConfig.enable51Surround ?
StreamConfiguration.AUDIO_CONFIGURATION_5_1 :
StreamConfiguration.AUDIO_CONFIGURATION_STEREO)
MoonBridge.AUDIO_CONFIGURATION_51_SURROUND :
MoonBridge.AUDIO_CONFIGURATION_STEREO)
.build();
// Initialize the connection
conn = new NvConnection(host, uniqueId, Game.this, config, PlatformBinding.getCryptoProvider(this));
keybTranslator = new KeyboardTranslator(conn);
conn = new NvConnection(host, uniqueId, config, PlatformBinding.getCryptoProvider(this));
controllerHandler = new ControllerHandler(this, conn, this, prefConfig.multiController, prefConfig.deadzonePercentage);
InputManager inputManager = (InputManager) getSystemService(Context.INPUT_SERVICE);
@@ -275,12 +335,10 @@ public class Game extends Activity implements SurfaceHolder.Callback,
getWindow().setSustainedPerformanceMode(true);
}
inputCaptureProvider = InputCaptureManager.getInputCaptureProvider(this, this);
if (prefConfig.onscreenController) {
// create virtual onscreen controller
virtualController = new VirtualController(conn,
(FrameLayout)findViewById(R.id.surfaceView).getParent(),
(FrameLayout)streamView.getParent(),
this);
virtualController.refreshLayout();
}
@@ -291,10 +349,53 @@ public class Game extends Activity implements SurfaceHolder.Callback,
usbDriverServiceConnection, Service.BIND_AUTO_CREATE);
}
if (!decoderRenderer.isAvcSupported()) {
if (spinner != null) {
spinner.dismiss();
spinner = null;
}
// If we can't find an AVC decoder, we can't proceed
Dialog.displayDialog(this, getResources().getString(R.string.conn_error_title),
"This device or ROM doesn't support hardware accelerated H.264 playback.", true);
return;
}
// The connection will be started when the surface gets created
streamView.getHolder().addCallback(this);
}
@Override
public void onUserLeaveHint() {
super.onUserLeaveHint();
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
if (prefConfig.enablePip && connected) {
enterPictureInPictureMode(
new PictureInPictureParams.Builder()
.setAspectRatio(new Rational(prefConfig.width, prefConfig.height))
.setSourceRectHint(new Rect(
streamView.getLeft(), streamView.getTop(),
streamView.getRight(), streamView.getBottom()))
.build());
}
}
}
@Override
public void onWindowFocusChanged(boolean hasFocus) {
super.onWindowFocusChanged(hasFocus);
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
// Capture is lost when focus is lost, so it must be requested again
// when focus is regained.
if (inputCaptureProvider.isCapturingEnabled() && hasFocus) {
// Recapture the pointer if focus was regained
streamView.requestPointerCapture();
}
}
}
private void prepareDisplayForRendering() {
Display display = getWindowManager().getDefaultDisplay();
WindowManager.LayoutParams windowLayoutParams = getWindow().getAttributes();
@@ -384,20 +485,18 @@ public class Game extends Activity implements SurfaceHolder.Callback,
}
}
private void checkDataConnection()
{
ConnectivityManager mgr = (ConnectivityManager) getSystemService(Context.CONNECTIVITY_SERVICE);
if (mgr.isActiveNetworkMetered()) {
displayTransientMessage(getResources().getString(R.string.conn_metered));
}
}
@SuppressLint("InlinedApi")
private final Runnable hideSystemUi = new Runnable() {
@Override
public void run() {
// In multi-window mode on N+, we need to drop our layout flags or we'll
// be drawing underneath the system UI.
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.N && isInMultiWindowMode()) {
Game.this.getWindow().getDecorView().setSystemUiVisibility(
View.SYSTEM_UI_FLAG_LAYOUT_STABLE);
}
// Use immersive mode on 4.4+ or standard low profile on previous builds
if (android.os.Build.VERSION.SDK_INT >= android.os.Build.VERSION_CODES.KITKAT) {
else if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.KITKAT) {
Game.this.getWindow().getDecorView().setSystemUiVisibility(
View.SYSTEM_UI_FLAG_LAYOUT_STABLE |
View.SYSTEM_UI_FLAG_LAYOUT_HIDE_NAVIGATION |
@@ -423,11 +522,36 @@ public class Game extends Activity implements SurfaceHolder.Callback,
}
@Override
protected void onStop() {
super.onStop();
@TargetApi(Build.VERSION_CODES.N)
public void onMultiWindowModeChanged(boolean isInMultiWindowMode) {
super.onMultiWindowModeChanged(isInMultiWindowMode);
SpinnerDialog.closeDialogs(this);
Dialog.closeDialogs();
// In multi-window, we don't want to use the full-screen layout
// flag. It will cause us to collide with the system UI.
// This function will also be called for PiP so we can cover
// that case here too.
if (isInMultiWindowMode) {
getWindow().clearFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN);
// Disable performance optimizations for foreground
getWindow().setSustainedPerformanceMode(false);
decoderRenderer.notifyVideoBackground();
}
else {
getWindow().addFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN);
// Enable performance optimizations for foreground
getWindow().setSustainedPerformanceMode(true);
decoderRenderer.notifyVideoForeground();
}
// Correct the system UI visibility flags
hideSystemUi(50);
}
@Override
protected void onDestroy() {
super.onDestroy();
if (controllerHandler != null) {
InputManager inputManager = (InputManager) getSystemService(Context.INPUT_SERVICE);
@@ -441,8 +565,19 @@ public class Game extends Activity implements SurfaceHolder.Callback,
unbindService(usbDriverServiceConnection);
}
// Destroy the capture provider
inputCaptureProvider.destroy();
}
@Override
protected void onStop() {
super.onStop();
SpinnerDialog.closeDialogs(this);
Dialog.closeDialogs();
if (conn != null) {
VideoDecoderRenderer.VideoFormat videoFormat = conn.getActiveVideoFormat();
int videoFormat = decoderRenderer.getActiveVideoFormat();
displayedFailureDialog = true;
stopConnection();
@@ -461,11 +596,14 @@ public class Game extends Activity implements SurfaceHolder.Callback,
}
// Add the video codec to the post-stream toast
if (message != null && videoFormat != VideoDecoderRenderer.VideoFormat.Unknown) {
if (videoFormat == VideoDecoderRenderer.VideoFormat.H265) {
if (message != null) {
if (videoFormat == MoonBridge.VIDEO_FORMAT_H265_MAIN10) {
message += " [H.265 HDR]";
}
else if (videoFormat == MoonBridge.VIDEO_FORMAT_H265) {
message += " [H.265]";
}
else {
else if (videoFormat == MoonBridge.VIDEO_FORMAT_H264) {
message += " [H.264]";
}
}
@@ -473,6 +611,14 @@ public class Game extends Activity implements SurfaceHolder.Callback,
if (message != null) {
Toast.makeText(this, message, Toast.LENGTH_LONG).show();
}
// Clear the tombstone count if we terminated normally
if (!reportedCrash && tombstonePrefs.getInt("CrashCount", 0) != 0) {
tombstonePrefs.edit()
.putInt("CrashCount", 0)
.putInt("LastNotifiedCrashCount", 0)
.apply();
}
}
finish();
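// Hypothetical sketch, not code from this diff: the two preference keys maintained above pair with
// the increment in the CrashListener earlier. A helper such as UiHelper.showDecoderCrashDialog()
// (called from PcView.onResume later in this diff, body not shown) could consume them roughly like
// this; the method name and exact logic here are assumptions.
static void maybeNotifyDecoderCrashSketch(android.content.SharedPreferences tombstonePrefs) {
    int crashCount = tombstonePrefs.getInt("CrashCount", 0);
    int lastNotified = tombstonePrefs.getInt("LastNotifiedCrashCount", 0);
    if (crashCount != 0 && crashCount != lastNotified) {
        // tell the user the decoder crashed, then remember that this count was already reported
        tombstonePrefs.edit().putInt("LastNotifiedCrashCount", crashCount).apply();
    }
}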
@@ -493,19 +639,19 @@ public class Game extends Activity implements SurfaceHolder.Callback,
};
// Returns true if the key stroke was consumed
private boolean handleSpecialKeys(short translatedKey, boolean down) {
private boolean handleSpecialKeys(int androidKeyCode, boolean down) {
int modifierMask = 0;
// Mask off the high byte
translatedKey &= 0xff;
if (translatedKey == KeyboardTranslator.VK_CONTROL) {
if (androidKeyCode == KeyEvent.KEYCODE_CTRL_LEFT ||
androidKeyCode == KeyEvent.KEYCODE_CTRL_RIGHT) {
modifierMask = KeyboardPacket.MODIFIER_CTRL;
}
else if (translatedKey == KeyboardTranslator.VK_SHIFT) {
else if (androidKeyCode == KeyEvent.KEYCODE_SHIFT_LEFT ||
androidKeyCode == KeyEvent.KEYCODE_SHIFT_RIGHT) {
modifierMask = KeyboardPacket.MODIFIER_SHIFT;
}
else if (translatedKey == KeyboardTranslator.VK_ALT) {
else if (androidKeyCode == KeyEvent.KEYCODE_ALT_LEFT ||
androidKeyCode == KeyEvent.KEYCODE_ALT_RIGHT) {
modifierMask = KeyboardPacket.MODIFIER_ALT;
}
@@ -517,7 +663,7 @@ public class Game extends Activity implements SurfaceHolder.Callback,
}
// Check if Ctrl+Shift+Z is pressed
if (translatedKey == KeyboardTranslator.VK_Z &&
if (androidKeyCode == KeyEvent.KEYCODE_Z &&
(modifierFlags & (KeyboardPacket.MODIFIER_CTRL | KeyboardPacket.MODIFIER_SHIFT)) ==
(KeyboardPacket.MODIFIER_CTRL | KeyboardPacket.MODIFIER_SHIFT))
{
@@ -578,17 +724,27 @@ public class Game extends Activity implements SurfaceHolder.Callback,
return super.onKeyDown(keyCode, event);
}
// Try the controller handler first
boolean handled = controllerHandler.handleButtonDown(event);
boolean handled = false;
boolean detectedGamepad = event.getDevice() != null && ((event.getDevice().getSources() & InputDevice.SOURCE_JOYSTICK) == InputDevice.SOURCE_JOYSTICK ||
(event.getDevice().getSources() & InputDevice.SOURCE_GAMEPAD) == InputDevice.SOURCE_GAMEPAD);
if (detectedGamepad || (event.getDevice() == null ||
event.getDevice().getKeyboardType() != InputDevice.KEYBOARD_TYPE_ALPHABETIC
)) {
// Always try the controller handler first, unless it's an alphanumeric keyboard device.
// Otherwise, the controller handler will eat keyboard d-pad events.
handled = controllerHandler.handleButtonDown(event);
}
if (!handled) {
// Try the keyboard handler
short translated = keybTranslator.translate(event.getKeyCode());
short translated = KeyboardTranslator.translate(event.getKeyCode());
if (translated == 0) {
return super.onKeyDown(keyCode, event);
}
// Let this method take duplicate key down events
if (handleSpecialKeys(translated, true)) {
if (handleSpecialKeys(keyCode, true)) {
return true;
}
@@ -597,8 +753,7 @@ public class Game extends Activity implements SurfaceHolder.Callback,
return super.onKeyDown(keyCode, event);
}
keybTranslator.sendKeyDown(translated,
getModifierState(event));
conn.sendKeyboardInput(translated, KeyboardPacket.KEY_DOWN, getModifierState(event));
}
return true;
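// Illustrative sketch, not code from this change set: with the removed lines filtered out, the
// keyboard half of the new key-down path reduces to roughly the following. The gamepad branch and
// the remaining guard shown only as context above are elided, and the helper name is invented.
private boolean sendKeyDownSketch(KeyEvent event) {
    short translated = KeyboardTranslator.translate(event.getKeyCode());
    if (translated == 0) {
        return false;                                    // no Windows VK mapping; let Android handle it
    }
    if (handleSpecialKeys(event.getKeyCode(), true)) {   // handleSpecialKeys() now takes the Android key code
        return true;
    }
    conn.sendKeyboardInput(translated, KeyboardPacket.KEY_DOWN, getModifierState(event));
    return true;
}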
@@ -611,16 +766,25 @@ public class Game extends Activity implements SurfaceHolder.Callback,
return super.onKeyUp(keyCode, event);
}
// Try the controller handler first
boolean handled = controllerHandler.handleButtonUp(event);
boolean handled = false;
boolean detectedGamepad = event.getDevice() != null && ((event.getDevice().getSources() & InputDevice.SOURCE_JOYSTICK) == InputDevice.SOURCE_JOYSTICK ||
(event.getDevice().getSources() & InputDevice.SOURCE_GAMEPAD) == InputDevice.SOURCE_GAMEPAD);
if (detectedGamepad || (event.getDevice() == null ||
event.getDevice().getKeyboardType() != InputDevice.KEYBOARD_TYPE_ALPHABETIC
)) {
// Always try the controller handler first, unless it's an alphanumeric keyboard device.
// Otherwise, the controller handler will eat keyboard d-pad events.
handled = controllerHandler.handleButtonUp(event);
}
if (!handled) {
// Try the keyboard handler
short translated = keybTranslator.translate(event.getKeyCode());
short translated = KeyboardTranslator.translate(event.getKeyCode());
if (translated == 0) {
return super.onKeyUp(keyCode, event);
}
if (handleSpecialKeys(translated, false)) {
if (handleSpecialKeys(keyCode, false)) {
return true;
}
@@ -629,8 +793,7 @@ public class Game extends Activity implements SurfaceHolder.Callback,
return super.onKeyUp(keyCode, event);
}
keybTranslator.sendKeyUp(translated,
getModifierState(event));
conn.sendKeyboardInput(translated, KeyboardPacket.KEY_UP, getModifierState(event));
}
return true;
@@ -665,7 +828,8 @@ public class Game extends Activity implements SurfaceHolder.Callback,
return true;
}
}
else if ((event.getSource() & InputDevice.SOURCE_CLASS_POINTER) != 0)
else if ((event.getSource() & InputDevice.SOURCE_CLASS_POINTER) != 0 ||
event.getSource() == InputDevice.SOURCE_MOUSE_RELATIVE)
{
// This case is for mice
if (event.getSource() == InputDevice.SOURCE_MOUSE ||
@@ -674,6 +838,11 @@ public class Game extends Activity implements SurfaceHolder.Callback,
{
int changedButtons = event.getButtonState() ^ lastButtonState;
// Ignore mouse input if we're not capturing from our input source
if (!inputCaptureProvider.isCapturingActive()) {
return false;
}
if (event.getActionMasked() == MotionEvent.ACTION_SCROLL) {
// Send the vertical scroll packet
byte vScrollClicks = (byte) event.getAxisValue(MotionEvent.AXIS_VSCROLL);
@@ -872,51 +1041,72 @@ public class Game extends Activity implements SurfaceHolder.Callback,
}
@Override
public void stageStarting(Stage stage) {
public void stageStarting(String stage) {
if (spinner != null) {
spinner.setMessage(getResources().getString(R.string.conn_starting)+" "+stage.getName());
spinner.setMessage(getResources().getString(R.string.conn_starting)+" "+stage);
}
}
@Override
public void stageComplete(Stage stage) {
public void stageComplete(String stage) {
}
private void stopConnection() {
if (connecting || connected) {
connecting = connected = false;
conn.stop();
// Stop may take a few hundred ms to do some network I/O to tell
// the server we're going away and clean up. Let it run in a separate
// thread to keep things smooth for the UI. Inside moonlight-common,
// we prevent another thread from starting a connection before and
// during the process of stopping this one.
new Thread() {
public void run() {
conn.stop();
}
}.start();
}
// Enable cursor visibility again
inputCaptureProvider.disableCapture();
// Destroy the capture provider
inputCaptureProvider.destroy();
}
@Override
public void stageFailed(Stage stage) {
public void stageFailed(String stage, long errorCode) {
if (spinner != null) {
spinner.dismiss();
spinner = null;
}
// Enable cursor visibility again
inputCaptureProvider.disableCapture();
if (!displayedFailureDialog) {
displayedFailureDialog = true;
stopConnection();
LimeLog.severe(stage+" failed: "+errorCode);
// If video initialization failed and the surface is still valid, display extra information for the user
if (stage.contains("video") && streamView.getHolder().getSurface().isValid()) {
runOnUiThread(new Runnable() {
@Override
public void run() {
Toast.makeText(Game.this, "Video decoder failed to initialize. Your device may not support the selected resolution.", Toast.LENGTH_LONG).show();
}
});
}
Dialog.displayDialog(this, getResources().getString(R.string.conn_error_title),
getResources().getString(R.string.conn_error_msg)+" "+stage.getName(), true);
getResources().getString(R.string.conn_error_msg)+" "+stage, true);
}
}
@Override
public void connectionTerminated(Exception e) {
public void connectionTerminated(long errorCode) {
// Enable cursor visibility again
inputCaptureProvider.disableCapture();
if (!displayedFailureDialog) {
displayedFailureDialog = true;
e.printStackTrace();
LimeLog.severe("Connection terminated: "+errorCode);
stopConnection();
Dialog.displayDialog(this, getResources().getString(R.string.conn_terminated_title),
getResources().getString(R.string.conn_terminated_msg), true);
}
@@ -976,22 +1166,17 @@ public class Game extends Activity implements SurfaceHolder.Callback,
if (!connected && !connecting) {
connecting = true;
conn.start(PlatformBinding.getDeviceName(), holder, drFlags,
PlatformBinding.getAudioRenderer(), decoderRenderer);
decoderRenderer.setRenderTarget(holder);
conn.start(PlatformBinding.getAudioRenderer(), decoderRenderer, Game.this);
}
}
@Override
public void surfaceDestroyed(SurfaceHolder holder) {
if (connected) {
// HACK: Android is supposed to let you return from this function
// before throwing a fit if you access the surface again. Unfortunately,
// MediaCodec often tries to access the destroyed surface and triggers
// an IllegalStateException. To workaround this, we will invoke
// the DecoderRenderer's stop function ourselves, so it will hopefully
// happen early enough to not trigger the bug
decoderRenderer.stop();
// Let the decoder know immediately that the surface is gone
decoderRenderer.prepareForStop();
if (connected) {
stopConnection();
}
}
@@ -1036,17 +1221,18 @@ public class Game extends Activity implements SurfaceHolder.Callback,
@Override
public void keyboardEvent(boolean buttonDown, short keyCode) {
short keyMap = keybTranslator.translate(keyCode);
short keyMap = KeyboardTranslator.translate(keyCode);
if (keyMap != 0) {
if (handleSpecialKeys(keyMap, buttonDown)) {
// handleSpecialKeys() takes the Android keycode
if (handleSpecialKeys(keyCode, buttonDown)) {
return;
}
if (buttonDown) {
keybTranslator.sendKeyDown(keyMap, getModifierState());
conn.sendKeyboardInput(keyMap, KeyboardPacket.KEY_DOWN, getModifierState());
}
else {
keybTranslator.sendKeyUp(keyMap, getModifierState());
conn.sendKeyboardInput(keyMap, KeyboardPacket.KEY_UP, getModifierState());
}
}
}

View File

@@ -51,14 +51,8 @@ public class HelpActivity extends Activity {
@Override
public boolean shouldOverrideUrlLoading(WebView view, String url) {
if (url.toUpperCase().startsWith("https://github.com/moonlight-stream/moonlight-docs/wiki/".toUpperCase()) ||
url.toUpperCase().startsWith("http://github.com/moonlight-stream/moonlight-docs/wiki/".toUpperCase())) {
// Allow navigation to Moonlight docs
return false;
}
else {
return true;
}
return !(url.toUpperCase().startsWith("https://github.com/moonlight-stream/moonlight-docs/wiki/".toUpperCase()) ||
url.toUpperCase().startsWith("http://github.com/moonlight-stream/moonlight-docs/wiki/".toUpperCase()));
}
});

View File

@@ -2,9 +2,7 @@ package com.limelight;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.net.InetAddress;
import java.net.UnknownHostException;
import java.util.Locale;
import com.limelight.binding.PlatformBinding;
import com.limelight.binding.crypto.AndroidCryptoProvider;
@@ -18,6 +16,7 @@ import com.limelight.nvstream.http.PairingManager;
import com.limelight.nvstream.http.PairingManager.PairState;
import com.limelight.nvstream.wol.WakeOnLanSender;
import com.limelight.preferences.AddComputerManually;
import com.limelight.preferences.GlPreferences;
import com.limelight.preferences.PreferenceConfiguration;
import com.limelight.preferences.StreamSettings;
import com.limelight.ui.AdapterFragment;
@@ -34,6 +33,8 @@ import android.content.ComponentName;
import android.content.Intent;
import android.content.ServiceConnection;
import android.content.res.Configuration;
import android.opengl.GLSurfaceView;
import android.os.Build;
import android.os.Bundle;
import android.os.IBinder;
import android.preference.PreferenceManager;
@@ -51,12 +52,15 @@ import android.widget.RelativeLayout;
import android.widget.Toast;
import android.widget.AdapterView.AdapterContextMenuInfo;
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;
public class PcView extends Activity implements AdapterFragmentCallbacks {
private RelativeLayout noPcFoundLayout;
private PcGridAdapter pcGridAdapter;
private ShortcutHelper shortcutHelper;
private ComputerManagerService.ComputerManagerBinder managerBinder;
private boolean freezeUpdates, runningPolling, inForeground;
private boolean freezeUpdates, runningPolling, inForeground, completeOnCreateCalled;
private final ServiceConnection serviceConnection = new ServiceConnection() {
public void onServiceConnected(ComponentName className, IBinder binder) {
final ComputerManagerService.ComputerManagerBinder localBinder =
@@ -86,11 +90,19 @@ public class PcView extends Activity implements AdapterFragmentCallbacks {
}
};
@Override
public void onConfigurationChanged(Configuration newConfig) {
super.onConfigurationChanged(newConfig);
// Reinitialize views just in case orientation changed
initializeViews();
// Only reinitialize views if completeOnCreate() was called
// before this callback. If it was not, completeOnCreate() will
// handle initializing views with the config change accounted for.
// This is not prone to races because both callbacks are invoked
// in the main thread.
if (completeOnCreateCalled) {
// Reinitialize views just in case orientation changed
initializeViews();
}
}
private final static int APP_LIST_ID = 1;
@@ -110,9 +122,9 @@ public class PcView extends Activity implements AdapterFragmentCallbacks {
PreferenceManager.setDefaultValues(this, R.xml.preferences, false);
// Setup the list view
ImageButton settingsButton = (ImageButton) findViewById(R.id.settingsButton);
ImageButton addComputerButton = (ImageButton) findViewById(R.id.manuallyAddPc);
ImageButton helpButton = (ImageButton) findViewById(R.id.helpButton);
ImageButton settingsButton = findViewById(R.id.settingsButton);
ImageButton addComputerButton = findViewById(R.id.manuallyAddPc);
ImageButton helpButton = findViewById(R.id.helpButton);
settingsButton.setOnClickListener(new OnClickListener() {
@Override
@@ -138,7 +150,7 @@ public class PcView extends Activity implements AdapterFragmentCallbacks {
.replace(R.id.pcFragmentContainer, new AdapterFragment())
.commitAllowingStateLoss();
noPcFoundLayout = (RelativeLayout) findViewById(R.id.no_pc_found_layout);
noPcFoundLayout = findViewById(R.id.no_pc_found_layout);
if (pcGridAdapter.getCount() == 0) {
noPcFoundLayout.setVisibility(View.VISIBLE);
}
@@ -152,14 +164,55 @@ public class PcView extends Activity implements AdapterFragmentCallbacks {
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
// Assume we're in the foreground when created to avoid a race
// between binding to CMS and onResume()
inForeground = true;
// Create a GLSurfaceView to fetch GLRenderer unless we have
// a cached result already.
final GlPreferences glPrefs = GlPreferences.readPreferences(this);
if (!glPrefs.savedFingerprint.equals(Build.FINGERPRINT) || glPrefs.glRenderer.isEmpty()) {
GLSurfaceView surfaceView = new GLSurfaceView(this);
surfaceView.setRenderer(new GLSurfaceView.Renderer() {
@Override
public void onSurfaceCreated(GL10 gl10, EGLConfig eglConfig) {
// Save the GLRenderer string so we don't need to do this next time
glPrefs.glRenderer = gl10.glGetString(GL10.GL_RENDERER);
glPrefs.savedFingerprint = Build.FINGERPRINT;
glPrefs.writePreferences();
LimeLog.info("Fetched GL Renderer: " + glPrefs.glRenderer);
runOnUiThread(new Runnable() {
@Override
public void run() {
completeOnCreate();
}
});
}
@Override
public void onSurfaceChanged(GL10 gl10, int i, int i1) {
}
@Override
public void onDrawFrame(GL10 gl10) {
}
});
setContentView(surfaceView);
}
else {
LimeLog.info("Cached GL Renderer: " + glPrefs.glRenderer);
completeOnCreate();
}
}
private void completeOnCreate() {
completeOnCreateCalled = true;
shortcutHelper = new ShortcutHelper(this);
String locale = PreferenceConfiguration.readPreferences(this).language;
if (!locale.equals(PreferenceConfiguration.DEFAULT_LANGUAGE)) {
Configuration config = new Configuration(getResources().getConfiguration());
config.locale = new Locale(locale);
getResources().updateConfiguration(config, getResources().getDisplayMetrics());
}
UiHelper.setLocale(this);
// Bind to the computer manager service
bindService(new Intent(PcView.this, ComputerManagerService.class), serviceConnection,
@@ -225,6 +278,9 @@ public class PcView extends Activity implements AdapterFragmentCallbacks {
protected void onResume() {
super.onResume();
// Display a decoder crash notification if we've returned after a crash
UiHelper.showDecoderCrashDialog(this);
inForeground = true;
startComputerUpdates();
}
@@ -311,19 +367,7 @@ public class PcView extends Activity implements AdapterFragmentCallbacks {
// Stop updates and wait while pairing
stopComputerUpdates(true);
InetAddress addr;
if (computer.reachability == ComputerDetails.Reachability.LOCAL) {
addr = computer.localIp;
}
else if (computer.reachability == ComputerDetails.Reachability.REMOTE) {
addr = computer.remoteIp;
}
else {
LimeLog.warning("Unknown reachability - using local IP");
addr = computer.localIp;
}
httpConn = new NvHTTP(addr,
httpConn = new NvHTTP(ServerHelper.getCurrentAddressFromComputer(computer),
managerBinder.getUniqueId(),
PlatformBinding.getDeviceName(),
PlatformBinding.getCryptoProvider(PcView.this));
@@ -448,19 +492,7 @@ public class PcView extends Activity implements AdapterFragmentCallbacks {
NvHTTP httpConn;
String message;
try {
InetAddress addr;
if (computer.reachability == ComputerDetails.Reachability.LOCAL) {
addr = computer.localIp;
}
else if (computer.reachability == ComputerDetails.Reachability.REMOTE) {
addr = computer.remoteIp;
}
else {
LimeLog.warning("Unknown reachability - using local IP");
addr = computer.localIp;
}
httpConn = new NvHTTP(addr,
httpConn = new NvHTTP(ServerHelper.getCurrentAddressFromComputer(computer),
managerBinder.getUniqueId(),
PlatformBinding.getDeviceName(),
PlatformBinding.getCryptoProvider(PcView.this));
@@ -547,7 +579,7 @@ public class PcView extends Activity implements AdapterFragmentCallbacks {
return true;
}
ServerHelper.doStart(this, new NvApp("app", computer.details.runningGameId), computer.details, managerBinder);
ServerHelper.doStart(this, new NvApp("app", computer.details.runningGameId, false), computer.details, managerBinder);
return true;
case QUIT_ID:
@@ -562,7 +594,7 @@ public class PcView extends Activity implements AdapterFragmentCallbacks {
public void run() {
ServerHelper.doQuit(PcView.this,
ServerHelper.getCurrentAddressFromComputer(computer.details),
new NvApp("app", 0), managerBinder, null);
new NvApp("app", 0, false), managerBinder, null);
}
}, null);
return true;

View File

@@ -1,39 +1,86 @@
package com.limelight.binding.audio;
import android.media.AudioAttributes;
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;
import android.os.Build;
import com.limelight.LimeLog;
import com.limelight.nvstream.av.audio.AudioRenderer;
import com.limelight.nvstream.jni.MoonBridge;
public class AndroidAudioRenderer implements AudioRenderer {
private AudioTrack track;
@Override
public boolean streamInitialized(int channelCount, int channelMask, int samplesPerFrame, int sampleRate) {
int channelConfig;
int bufferSize;
int bytesPerFrame = (samplesPerFrame * 2);
private AudioTrack createAudioTrack(int channelConfig, int bufferSize, boolean lowLatency) {
if (Build.VERSION.SDK_INT < Build.VERSION_CODES.LOLLIPOP) {
return new AudioTrack(AudioManager.STREAM_MUSIC,
48000,
channelConfig,
AudioFormat.ENCODING_PCM_16BIT,
bufferSize,
AudioTrack.MODE_STREAM);
}
else {
AudioAttributes.Builder attributesBuilder = new AudioAttributes.Builder()
.setUsage(AudioAttributes.USAGE_GAME);
AudioFormat format = new AudioFormat.Builder()
.setEncoding(AudioFormat.ENCODING_PCM_16BIT)
.setSampleRate(48000)
.setChannelMask(channelConfig)
.build();
switch (channelCount)
if (Build.VERSION.SDK_INT < Build.VERSION_CODES.O) {
// Use FLAG_LOW_LATENCY on L through N
if (lowLatency) {
attributesBuilder.setFlags(AudioAttributes.FLAG_LOW_LATENCY);
}
}
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
AudioTrack.Builder trackBuilder = new AudioTrack.Builder()
.setAudioFormat(format)
.setAudioAttributes(attributesBuilder.build())
.setTransferMode(AudioTrack.MODE_STREAM)
.setBufferSizeInBytes(bufferSize);
// Use PERFORMANCE_MODE_LOW_LATENCY on O and later
if (lowLatency) {
trackBuilder.setPerformanceMode(AudioTrack.PERFORMANCE_MODE_LOW_LATENCY);
}
return trackBuilder.build();
}
else {
return new AudioTrack(attributesBuilder.build(),
format,
bufferSize,
AudioTrack.MODE_STREAM,
AudioManager.AUDIO_SESSION_ID_GENERATE);
}
}
}
@Override
public int setup(int audioConfiguration) {
int channelConfig;
int bytesPerFrame;
switch (audioConfiguration)
{
case 1:
channelConfig = AudioFormat.CHANNEL_OUT_MONO;
break;
case 2:
channelConfig = AudioFormat.CHANNEL_OUT_STEREO;
break;
case 4:
channelConfig = AudioFormat.CHANNEL_OUT_QUAD;
break;
case 6:
channelConfig = AudioFormat.CHANNEL_OUT_5POINT1;
break;
default:
LimeLog.severe("Decoder returned unhandled channel count");
return false;
case MoonBridge.AUDIO_CONFIGURATION_STEREO:
channelConfig = AudioFormat.CHANNEL_OUT_STEREO;
bytesPerFrame = 2 * 240 * 2;
break;
case MoonBridge.AUDIO_CONFIGURATION_51_SURROUND:
channelConfig = AudioFormat.CHANNEL_OUT_5POINT1;
bytesPerFrame = 6 * 240 * 2;
break;
default:
LimeLog.severe("Decoder returned unhandled channel count");
return -1;
}
// We're not supposed to request less than the minimum
@@ -41,62 +88,102 @@ public class AndroidAudioRenderer implements AudioRenderer {
// buffer size, but it seems safe to
// do this on many devices and it lowers audio latency.
// We'll try the small buffer size first and if it fails,
// use the recommended larger buffer size.
try {
// Buffer two frames of audio if possible
bufferSize = bytesPerFrame * 2;
track = new AudioTrack(AudioManager.STREAM_MUSIC,
sampleRate,
channelConfig,
AudioFormat.ENCODING_PCM_16BIT,
bufferSize,
AudioTrack.MODE_STREAM);
track.play();
} catch (Exception e) {
// Try to release the AudioTrack if we got far enough
try {
if (track != null) {
track.release();
}
} catch (Exception ignored) {}
for (int i = 0; i < 4; i++) {
boolean lowLatency;
int bufferSize;
// Now try the larger buffer size
bufferSize = Math.max(AudioTrack.getMinBufferSize(sampleRate,
// We will try:
// 1) Small buffer, low latency mode
// 2) Large buffer, low latency mode
// 3) Small buffer, standard mode
// 4) Large buffer, standard mode
switch (i) {
case 0:
case 1:
lowLatency = true;
break;
case 2:
case 3:
lowLatency = false;
break;
default:
// Unreachable
throw new IllegalStateException();
}
switch (i) {
case 0:
case 2:
bufferSize = bytesPerFrame * 2;
break;
case 1:
case 3:
// Try the larger buffer size
bufferSize = Math.max(AudioTrack.getMinBufferSize(48000,
channelConfig,
AudioFormat.ENCODING_PCM_16BIT),
bytesPerFrame * 2);
bytesPerFrame * 2);
// Round to next frame
bufferSize = (((bufferSize + (bytesPerFrame - 1)) / bytesPerFrame) * bytesPerFrame);
// Round to next frame
bufferSize = (((bufferSize + (bytesPerFrame - 1)) / bytesPerFrame) * bytesPerFrame);
break;
default:
// Unreachable
throw new IllegalStateException();
}
track = new AudioTrack(AudioManager.STREAM_MUSIC,
sampleRate,
channelConfig,
AudioFormat.ENCODING_PCM_16BIT,
bufferSize,
AudioTrack.MODE_STREAM);
track.play();
// Skip low latency options if hardware sample rate isn't 48000Hz
if (AudioTrack.getNativeOutputSampleRate(AudioManager.STREAM_MUSIC) != 48000 && lowLatency) {
continue;
}
try {
track = createAudioTrack(channelConfig, bufferSize, lowLatency);
track.play();
// Successfully created working AudioTrack. We're done here.
LimeLog.info("Audio track configuration: "+bufferSize+" "+lowLatency);
break;
} catch (Exception e) {
// Try to release the AudioTrack if we got far enough
e.printStackTrace();
try {
if (track != null) {
track.release();
track = null;
}
} catch (Exception ignored) {}
}
}
LimeLog.info("Audio track buffer size: "+bufferSize);
return true;
}
@Override
public void playDecodedAudio(byte[] audioData, int offset, int length) {
track.write(audioData, offset, length);
}
@Override
public void streamClosing() {
if (track != null) {
track.release();
if (track == null) {
// Couldn't create any audio track for playback
return -2;
}
}
@Override
public int getCapabilities() {
return 0;
}
@Override
public void playDecodedAudio(byte[] audioData) {
track.write(audioData, 0, audioData.length);
}
@Override
public void start() {}
@Override
public void stop() {
// Immediately drop all pending data
track.pause();
track.flush();
}
@Override
public void cleanup() {
track.release();
}
}
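// The bytesPerFrame constants in setup() above (2 * 240 * 2 for stereo, 6 * 240 * 2 for 5.1) imply
// 240 16-bit samples per channel per audio frame; at the fixed 48000 Hz rate used by
// createAudioTrack() that is 5 ms of audio, so the small two-frame buffer covers about 10 ms.
// Self-contained check of that arithmetic (illustration only, not part of this change):
public class AudioBufferMath {
    public static void main(String[] args) {
        int sampleRate = 48000;            // rate hard-coded in createAudioTrack() above
        int samplesPerFrame = 240;         // implied by the 2*240*2 and 6*240*2 constants
        int bytesPerSample = 2;            // ENCODING_PCM_16BIT

        double frameMs = 1000.0 * samplesPerFrame / sampleRate;               // 5.0 ms
        int stereoSmallBuffer = 2 * samplesPerFrame * bytesPerSample * 2;     // 1920 bytes (two frames)
        int surroundSmallBuffer = 6 * samplesPerFrame * bytesPerSample * 2;   // 5760 bytes (two frames)

        System.out.printf("frame = %.1f ms, stereo buffer = %d B, 5.1 buffer = %d B%n",
                frameMs, stereoSmallBuffer, surroundSmallBuffer);
    }
}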

View File

@@ -63,6 +63,10 @@ public class ControllerHandler implements InputManager.InputDeviceListener, UsbD
int[] ids = InputDevice.getDeviceIds();
for (int id : ids) {
InputDevice dev = InputDevice.getDevice(id);
if (dev == null) {
// This device was removed during enumeration
continue;
}
if ((dev.getSources() & InputDevice.SOURCE_JOYSTICK) != 0 ||
(dev.getSources() & InputDevice.SOURCE_GAMEPAD) != 0) {
// This looks like a gamepad, but we'll check X and Y to be sure
@@ -92,6 +96,12 @@ public class ControllerHandler implements InputManager.InputDeviceListener, UsbD
defaultContext.rightTriggerAxis = MotionEvent.AXIS_GAS;
defaultContext.controllerNumber = (short) 0;
defaultContext.assignedControllerNumber = true;
// Some devices (GPD XD) have a back button which sends input events
// with device ID == 0. This hits the default context which would normally
// consume these. Instead, let's ignore them since that's probably the
// most likely case.
defaultContext.ignoreBack = true;
}
private static InputDevice.MotionRange getMotionRangeForJoystickAxis(InputDevice dev, int axis) {
@@ -132,7 +142,8 @@ public class ControllerHandler implements InputManager.InputDeviceListener, UsbD
private void releaseControllerNumber(GenericControllerContext context) {
// If this device sent data as a gamepad, zero the values before removing
if (context.assignedControllerNumber) {
conn.sendControllerInput(context.controllerNumber, (short) 0,
conn.sendControllerInput(context.controllerNumber, getActiveControllerMask(),
(short) 0,
(byte) 0, (byte) 0,
(short) 0, (short) 0,
(short) 0, (short) 0);
@@ -339,18 +350,6 @@ public class ControllerHandler implements InputManager.InputDeviceListener, UsbD
}
}
// Ignore the back button if a controller has both buttons
if (android.os.Build.VERSION.SDK_INT >= android.os.Build.VERSION_CODES.KITKAT) {
boolean[] hasSelectKey = dev.hasKeys(KeyEvent.KEYCODE_BUTTON_SELECT, KeyEvent.KEYCODE_BACK, 0);
if (hasSelectKey[0] && hasSelectKey[1]) {
// Xiaomi gamepads claim to have both buttons but then only send KEYCODE_BACK events
if (dev.getVendorId() != 0x2717) {
LimeLog.info("Ignoring back button because select is present");
context.ignoreBack = true;
}
}
}
// The ADT-1 controller needs a similar fixup to the ASUS Gamepad
if (android.os.Build.VERSION.SDK_INT >= android.os.Build.VERSION_CODES.KITKAT) {
// The device name provided is just "Gamepad" which is pretty useless, so we
@@ -382,7 +381,7 @@ public class ControllerHandler implements InputManager.InputDeviceListener, UsbD
context.triggerDeadzone = 0.30f;
}
// Classify this device as a remote by name
else if (devName.contains("Fire TV Remote") || devName.contains("Nexus Remote")) {
else if (devName.toLowerCase().contains("remote")) {
// It's only a remote if it doesn't have any sticks
if (!context.hasJoystickAxes) {
context.ignoreBack = true;
@@ -394,8 +393,10 @@ public class ControllerHandler implements InputManager.InputDeviceListener, UsbD
context.rightStickDeadzoneRadius = 0.07f;
}
// Samsung's face buttons appear as a non-virtual button so we'll explicitly ignore
// back presses on this device
else if (devName.equals("sec_touchscreen") || devName.equals("sec_touchkey")) {
// back presses on this device. The Goodix buttons on the Nokia 6 also appear
// non-virtual so we'll ignore those too.
else if (devName.equals("sec_touchscreen") || devName.equals("sec_touchkey") ||
devName.equals("goodix_fp")) {
context.ignoreBack = true;
}
// The Serval has a couple of unknown buttons that are start and select. It also has
@@ -414,9 +415,6 @@ public class ControllerHandler implements InputManager.InputDeviceListener, UsbD
if (gasRange == null) {
context.isXboxBtController = true;
}
else {
context.ignoreBack = false;
}
}
}
@@ -473,6 +471,16 @@ public class ControllerHandler implements InputManager.InputDeviceListener, UsbD
}
}
private short getActiveControllerMask() {
if (multiControllerEnabled) {
return currentControllers;
}
else {
// Only Player 1 is active with multi-controller disabled
return 1;
}
}
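// Hypothetical illustration, not code from this change: the Player-1 fallback of 1 suggests the
// mask is a bitfield with bit N set while controller number N is assigned. The bookkeeping behind
// currentControllers is outside the lines shown here, so treat this as an assumption.
short mask = 0;
mask |= 1 << 0;   // controller number 0 (player 1) in use
mask |= 1 << 1;   // controller number 1 (player 2) in use
// mask == 3 would then be passed as the new mask argument in the
// conn.sendControllerInput(...) calls below.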
private void sendControllerInputPacket(GenericControllerContext originalContext) {
assignControllerNumberIfNeeded(originalContext);
@@ -552,11 +560,12 @@ public class ControllerHandler implements InputManager.InputDeviceListener, UsbD
}
}
conn.sendControllerInput(controllerNumber,
conn.sendControllerInput(controllerNumber, getActiveControllerMask(),
(short)0, (byte)0, (byte)0, (short)0, (short)0, (short)0, (short)0);
}
else {
conn.sendControllerInput(controllerNumber, inputMap,
conn.sendControllerInput(controllerNumber, getActiveControllerMask(),
inputMap,
leftTrigger, rightTrigger,
leftStickX, leftStickY,
rightStickX, rightStickY);
@@ -965,9 +974,17 @@ public class ControllerHandler implements InputManager.InputDeviceListener, UsbD
context.inputMap &= ~ControllerPacket.RS_CLK_FLAG;
break;
case KeyEvent.KEYCODE_BUTTON_L2:
if (context.leftTriggerAxis >= 0) {
// Suppress this digital event if an analog trigger is present
return true;
}
context.leftTrigger = 0;
break;
case KeyEvent.KEYCODE_BUTTON_R2:
if (context.rightTriggerAxis >= 0) {
// Suppress this digital event if an analog trigger is present
return true;
}
context.rightTrigger = 0;
break;
default:
@@ -1077,9 +1094,17 @@ public class ControllerHandler implements InputManager.InputDeviceListener, UsbD
context.inputMap |= ControllerPacket.RS_CLK_FLAG;
break;
case KeyEvent.KEYCODE_BUTTON_L2:
if (context.leftTriggerAxis >= 0) {
// Suppress this digital event if an analog trigger is present
return true;
}
context.leftTrigger = (byte)0xFF;
break;
case KeyEvent.KEYCODE_BUTTON_R2:
if (context.rightTriggerAxis >= 0) {
// Suppress this digital event if an analog trigger is present
return true;
}
context.rightTrigger = (byte)0xFF;
break;
default:
@@ -1123,6 +1148,9 @@ public class ControllerHandler implements InputManager.InputDeviceListener, UsbD
float rightStickX, float rightStickY,
float leftTrigger, float rightTrigger) {
UsbDeviceContext context = usbDeviceContexts.get(controllerId);
if (context == null) {
return;
}
Vector2d leftStickVector = populateCachedVector(leftStickX, leftStickY);

View File

@@ -2,15 +2,12 @@ package com.limelight.binding.input;
import android.view.KeyEvent;
import com.limelight.nvstream.NvConnection;
import com.limelight.nvstream.input.KeycodeTranslator;
/**
* Class to translate an Android key code into the codes GFE is expecting
* @author Diego Waxemberg
* @author Cameron Gutman
*/
public class KeyboardTranslator extends KeycodeTranslator {
public class KeyboardTranslator {
/**
* GFE's prefix for every key code
@@ -21,22 +18,15 @@ public class KeyboardTranslator extends KeycodeTranslator {
public static final int VK_9 = 57;
public static final int VK_A = 65;
public static final int VK_Z = 90;
public static final int VK_ALT = 18;
public static final int VK_NUMPAD0 = 96;
public static final int VK_BACK_SLASH = 92;
public static final int VK_CAPS_LOCK = 20;
public static final int VK_CLEAR = 12;
public static final int VK_COMMA = 44;
public static final int VK_CONTROL = 17;
public static final int VK_BACK_SPACE = 8;
public static final int VK_EQUALS = 61;
public static final int VK_ESCAPE = 27;
public static final int VK_F1 = 112;
public static final int VK_PERIOD = 46;
public static final int VK_INSERT = 155;
public static final int VK_OPEN_BRACKET = 91;
public static final int VK_WINDOWS = 524;
public static final int VK_MINUS = 45;
public static final int VK_END = 35;
public static final int VK_HOME = 36;
public static final int VK_NUM_LOCK = 144;
@@ -46,7 +36,6 @@ public class KeyboardTranslator extends KeycodeTranslator {
public static final int VK_CLOSE_BRACKET = 93;
public static final int VK_SCROLL_LOCK = 145;
public static final int VK_SEMICOLON = 59;
public static final int VK_SHIFT = 16;
public static final int VK_SLASH = 47;
public static final int VK_SPACE = 32;
public static final int VK_PRINTSCREEN = 154;
@@ -59,27 +48,17 @@ public class KeyboardTranslator extends KeycodeTranslator {
public static final int VK_QUOTE = 222;
public static final int VK_PAUSE = 19;
/**
* Constructs a new translator for the specified connection
* @param conn the connection to which the translated codes are sent
*/
public KeyboardTranslator(NvConnection conn) {
super(conn);
}
/**
* Translates the given keycode and returns the GFE keycode
* @param keycode the code to be translated
* @return a GFE keycode for the given keycode
*/
@Override
public short translate(int keycode) {
public static short translate(int keycode) {
int translated;
/* There seems to be no clean mapping between Android key codes
* and what Nvidia sends over the wire. If someone finds one,
* I'll happily delete this code :)
*/
// This is a poor man's mapping between Android key codes
// and Windows VK_* codes. For all defined VK_ codes, see:
// https://msdn.microsoft.com/en-us/library/windows/desktop/dd375731(v=vs.85).aspx
if (keycode >= KeyEvent.KEYCODE_0 &&
keycode <= KeyEvent.KEYCODE_9) {
translated = (keycode - KeyEvent.KEYCODE_0) + VK_0;
@@ -99,8 +78,11 @@ public class KeyboardTranslator extends KeycodeTranslator {
else {
switch (keycode) {
case KeyEvent.KEYCODE_ALT_LEFT:
translated = 0xA4;
break;
case KeyEvent.KEYCODE_ALT_RIGHT:
translated = VK_ALT;
translated = 0xA5;
break;
case KeyEvent.KEYCODE_BACKSLASH:
@@ -120,8 +102,11 @@ public class KeyboardTranslator extends KeycodeTranslator {
break;
case KeyEvent.KEYCODE_CTRL_LEFT:
translated = 0xA2;
break;
case KeyEvent.KEYCODE_CTRL_RIGHT:
translated = VK_CONTROL;
translated = 0xA3;
break;
case KeyEvent.KEYCODE_DEL:
@@ -141,23 +126,25 @@ public class KeyboardTranslator extends KeycodeTranslator {
break;
case KeyEvent.KEYCODE_FORWARD_DEL:
// Nvidia maps period to delete
translated = VK_PERIOD;
translated = 0x2e;
break;
case KeyEvent.KEYCODE_INSERT:
translated = -1;
translated = 0x2d;
break;
case KeyEvent.KEYCODE_LEFT_BRACKET:
translated = 0xdb;
break;
case KeyEvent.KEYCODE_META_LEFT:
case KeyEvent.KEYCODE_META_RIGHT:
translated = VK_WINDOWS;
translated = 0x5b;
break;
case KeyEvent.KEYCODE_META_RIGHT:
translated = 0x5c;
break;
case KeyEvent.KEYCODE_MINUS:
translated = 0xbd;
break;
@@ -199,8 +186,11 @@ public class KeyboardTranslator extends KeycodeTranslator {
break;
case KeyEvent.KEYCODE_SHIFT_LEFT:
translated = 0xA0;
break;
case KeyEvent.KEYCODE_SHIFT_RIGHT:
translated = VK_SHIFT;
translated = 0xA1;
break;
case KeyEvent.KEYCODE_SLASH:
@@ -247,6 +237,26 @@ public class KeyboardTranslator extends KeycodeTranslator {
case KeyEvent.KEYCODE_BREAK:
translated = VK_PAUSE;
break;
case KeyEvent.KEYCODE_NUMPAD_DIVIDE:
translated = 0x6F;
break;
case KeyEvent.KEYCODE_NUMPAD_MULTIPLY:
translated = 0x6A;
break;
case KeyEvent.KEYCODE_NUMPAD_SUBTRACT:
translated = 0x6D;
break;
case KeyEvent.KEYCODE_NUMPAD_ADD:
translated = 0x6B;
break;
case KeyEvent.KEYCODE_NUMPAD_DOT:
translated = 0x6E;
break;
default:
System.out.println("No key for "+keycode);
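// Minimal usage sketch mirroring the Game call sites earlier in this diff, now that translate()
// is static; a return value of 0 means the Android key has no Windows VK mapping. Illustration
// only, not part of this change.
short translated = KeyboardTranslator.translate(KeyEvent.KEYCODE_A);
if (translated != 0) {
    // forward to the host as Game does, e.g.:
    // conn.sendKeyboardInput(translated, KeyboardPacket.KEY_DOWN, modifierState);
}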

View File

@@ -0,0 +1,53 @@
package com.limelight.binding.input.capture;
import android.annotation.TargetApi;
import android.os.Build;
import android.view.InputDevice;
import android.view.MotionEvent;
import android.view.View;
@TargetApi(Build.VERSION_CODES.O)
public class AndroidNativePointerCaptureProvider extends InputCaptureProvider {
private View targetView;
public AndroidNativePointerCaptureProvider(View targetView) {
this.targetView = targetView;
}
public static boolean isCaptureProviderSupported() {
return Build.VERSION.SDK_INT >= Build.VERSION_CODES.O;
}
@Override
public void enableCapture() {
super.enableCapture();
targetView.requestPointerCapture();
}
@Override
public void disableCapture() {
super.disableCapture();
targetView.releasePointerCapture();
}
@Override
public boolean isCapturingActive() {
return targetView.hasPointerCapture();
}
@Override
public boolean eventHasRelativeMouseAxes(MotionEvent event) {
return event.getSource() == InputDevice.SOURCE_MOUSE_RELATIVE;
}
@Override
public float getRelativeAxisX(MotionEvent event) {
return event.getX();
}
@Override
public float getRelativeAxisY(MotionEvent event) {
return event.getY();
}
}
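// Sketch of a hypothetical consumer, not code from this change: a motion-event handler can pair
// the relative-axis helpers above with the EvdevListener interface that appears later in this
// diff. The handler name and wiring are assumptions for illustration.
void handleCapturedMotionSketch(MotionEvent event, InputCaptureProvider provider, EvdevListener listener) {
    // Only consume events while capture is actually in effect (see isCapturingActive()).
    if (provider.isCapturingActive() && provider.eventHasRelativeMouseAxes(event)) {
        listener.mouseMove((int) provider.getRelativeAxisX(event),
                (int) provider.getRelativeAxisY(event));
    }
}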

View File

@@ -10,11 +10,11 @@ import android.view.View;
import android.view.ViewGroup;
@TargetApi(Build.VERSION_CODES.N)
public class AndroidCaptureProvider extends InputCaptureProvider {
public class AndroidPointerIconCaptureProvider extends InputCaptureProvider {
private ViewGroup rootViewGroup;
private Context context;
public AndroidCaptureProvider(Activity activity) {
public AndroidPointerIconCaptureProvider(Activity activity) {
this.context = activity;
this.rootViewGroup = (ViewGroup) activity.getWindow().getDecorView();
}
@@ -33,11 +33,13 @@ public class AndroidCaptureProvider extends InputCaptureProvider {
@Override
public void enableCapture() {
super.enableCapture();
setPointerIconOnAllViews(PointerIcon.getSystemIcon(context, PointerIcon.TYPE_NULL));
}
@Override
public void disableCapture() {
super.disableCapture();
setPointerIconOnAllViews(null);
}

View File

@@ -3,25 +3,29 @@ package com.limelight.binding.input.capture;
import android.app.Activity;
import com.limelight.LimeLog;
import com.limelight.binding.input.evdev.EvdevCaptureProvider;
import com.limelight.R;
import com.limelight.binding.input.evdev.EvdevCaptureProviderShim;
import com.limelight.binding.input.evdev.EvdevListener;
public class InputCaptureManager {
public static InputCaptureProvider getInputCaptureProvider(Activity activity, EvdevListener rootListener) {
// Shield capture is preferred because it can capture when the cursor is over
// the system UI. Android N native capture can only capture over views owned
// by the application.
if (ShieldCaptureProvider.isCaptureProviderSupported()) {
if (AndroidNativePointerCaptureProvider.isCaptureProviderSupported()) {
LimeLog.info("Using Android O+ native mouse capture");
return new AndroidNativePointerCaptureProvider(activity.findViewById(R.id.surfaceView));
}
else if (ShieldCaptureProvider.isCaptureProviderSupported()) {
LimeLog.info("Using NVIDIA mouse capture extension");
return new ShieldCaptureProvider(activity);
}
else if (EvdevCaptureProvider.isCaptureProviderSupported()) {
else if (EvdevCaptureProviderShim.isCaptureProviderSupported()) {
LimeLog.info("Using Evdev mouse capture");
return new EvdevCaptureProvider(activity, rootListener);
return EvdevCaptureProviderShim.createEvdevCaptureProvider(activity, rootListener);
}
else if (AndroidCaptureProvider.isCaptureProviderSupported()) {
LimeLog.info("Using Android N+ native mouse capture");
return new AndroidCaptureProvider(activity);
else if (AndroidPointerIconCaptureProvider.isCaptureProviderSupported()) {
// Android N's native capture can't capture over system UI elements
// so we want to only use it if there's no other option.
LimeLog.info("Using Android N+ pointer hiding");
return new AndroidPointerIconCaptureProvider(activity);
}
else {
LimeLog.info("Mouse capture not available");

View File

@@ -3,10 +3,25 @@ package com.limelight.binding.input.capture;
import android.view.MotionEvent;
public abstract class InputCaptureProvider {
public abstract void enableCapture();
public abstract void disableCapture();
protected boolean isCapturing;
public void enableCapture() {
isCapturing = true;
}
public void disableCapture() {
isCapturing = false;
}
public void destroy() {}
public boolean isCapturingEnabled() {
return isCapturing;
}
public boolean isCapturingActive() {
return isCapturing;
}
public boolean eventHasRelativeMouseAxes(MotionEvent event) {
return false;
}
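// Illustrative sketch (hypothetical subclass, not part of this change): splitting
// isCapturingEnabled() from isCapturingActive() lets a provider report that capture was requested
// even while the system has temporarily revoked it, which Game.onWindowFocusChanged() earlier in
// this diff relies on to re-request capture when focus returns.
class ExampleCaptureProvider extends InputCaptureProvider {
    private boolean systemHasCapture;

    @Override
    public boolean isCapturingActive() {
        // "Active" reflects the real system state; the inherited isCapturingEnabled()
        // only reflects whether the app asked for capture.
        return systemHasCapture;
    }
}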

View File

@@ -1,10 +1,4 @@
package com.limelight.binding.input.capture;
public class NullCaptureProvider extends InputCaptureProvider {
@Override
public void enableCapture() {}
@Override
public void disableCapture() {}
}
public class NullCaptureProvider extends InputCaptureProvider {}

View File

@@ -63,11 +63,13 @@ public class ShieldCaptureProvider extends InputCaptureProvider {
@Override
public void enableCapture() {
super.enableCapture();
setCursorVisibility(false);
}
@Override
public void disableCapture() {
super.disableCapture();
setCursorVisibility(true);
}

View File

@@ -11,6 +11,7 @@ import android.hardware.usb.UsbDeviceConnection;
import android.hardware.usb.UsbManager;
import android.os.Binder;
import android.os.Build;
import android.os.Handler;
import android.os.IBinder;
import android.view.InputDevice;
@@ -72,10 +73,22 @@ public class UsbDriverService extends Service implements UsbDriverListener {
// Initial attachment broadcast
if (action.equals(UsbManager.ACTION_USB_DEVICE_ATTACHED)) {
UsbDevice device = intent.getParcelableExtra(UsbManager.EXTRA_DEVICE);
final UsbDevice device = intent.getParcelableExtra(UsbManager.EXTRA_DEVICE);
// Continue the state machine
handleUsbDeviceState(device);
// shouldClaimDevice() looks at the kernel's enumerated input
// devices to make its decision about whether to prompt to take
// control of the device. The kernel bringing up the input stack
// may race with this callback and cause us to prompt when the
// kernel is capable of running the device. Let's post a delayed
// message to process this state change to allow the kernel
// some time to bring up the stack.
new Handler().postDelayed(new Runnable() {
@Override
public void run() {
// Continue the state machine
handleUsbDeviceState(device);
}
}, 1000);
}
// Subsequent permission dialog completion intent
else if (action.equals(ACTION_USB_PERMISSION)) {
@@ -151,6 +164,10 @@ public class UsbDriverService extends Service implements UsbDriverListener {
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.KITKAT) {
for (int id : InputDevice.getDeviceIds()) {
InputDevice inputDev = InputDevice.getDevice(id);
if (inputDev == null) {
// Device was removed while looping
continue;
}
if (inputDev.getVendorId() == device.getVendorId() &&
inputDev.getProductId() == device.getProductId()) {

View File

@@ -17,18 +17,22 @@ public class Xbox360Controller extends AbstractXboxController {
0x044f, // Thrustmaster
0x045e, // Microsoft
0x046d, // Logitech
0x056e, // Elecom
0x06a3, // Saitek
0x0738, // Mad Catz
0x07ff, // Mad Catz
0x0e6f, // Unknown
0x0f0d, // Hori
0x11c9, // Nacon
0x12ab, // Unknown
0x1430, // RedOctane
0x146b, // BigBen
0x1bad, // Harmonix
0x0f0d, // Hori
0x1689, // Razer Onza
0x24c6, // PowerA
0x1532, // Razer Sabertooth
0x15e4, // Numark
0x162e, // Joytech
0x1689, // Razer Onza
0x1bad, // Harmonix
0x24c6, // PowerA
};
public static boolean canClaimDevice(UsbDevice device) {

View File

@@ -19,6 +19,7 @@ public class XboxOneController extends AbstractXboxController {
0x0738, // Mad Catz
0x0e6f, // Unknown
0x0f0d, // Hori
0x1532, // Razer Wildcat
0x24c6, // PowerA
};

View File

@@ -0,0 +1,24 @@
package com.limelight.binding.input.evdev;
import android.app.Activity;
import com.limelight.LimelightBuildProps;
import com.limelight.binding.input.capture.InputCaptureProvider;
public class EvdevCaptureProviderShim {
public static boolean isCaptureProviderSupported() {
return LimelightBuildProps.ROOT_BUILD;
}
// We need to construct our capture provider using reflection because it isn't included in non-root builds
public static InputCaptureProvider createEvdevCaptureProvider(Activity activity, EvdevListener listener) {
try {
Class providerClass = Class.forName("com.limelight.binding.input.evdev.EvdevCaptureProvider");
return (InputCaptureProvider) providerClass.getConstructors()[0].newInstance(activity, listener);
} catch (Exception e) {
e.printStackTrace();
throw new RuntimeException(e);
}
}
}

View File

@@ -1,12 +1,12 @@
package com.limelight.binding.input.evdev;
public interface EvdevListener {
public static final int BUTTON_LEFT = 1;
public static final int BUTTON_MIDDLE = 2;
public static final int BUTTON_RIGHT = 3;
int BUTTON_LEFT = 1;
int BUTTON_MIDDLE = 2;
int BUTTON_RIGHT = 3;
public void mouseMove(int deltaX, int deltaY);
public void mouseButtonEvent(int buttonId, boolean down);
public void mouseScroll(byte amount);
public void keyboardEvent(boolean buttonDown, short keyCode);
void mouseMove(int deltaX, int deltaY);
void mouseButtonEvent(int buttonId, boolean down);
void mouseScroll(byte amount);
void keyboardEvent(boolean buttonDown, short keyCode);
}

View File

@@ -190,8 +190,14 @@ public class DigitalButton extends VirtualControllerElement {
for (DigitalButtonListener listener : listeners) {
listener.onRelease();
}
timerLongClick.cancel();
longClickTimerTask.cancel();
// We may be called for a release without a prior click
if (timerLongClick != null) {
timerLongClick.cancel();
}
if (longClickTimerTask != null) {
longClickTimerTask.cancel();
}
}
@Override

View File

@@ -8,6 +8,7 @@ import android.content.Context;
import android.util.DisplayMetrics;
import com.limelight.nvstream.input.ControllerPacket;
import com.limelight.preferences.PreferenceConfiguration;
public class VirtualControllerConfigurationLoader {
private static final String PROFILE_PATH = "profiles";
@@ -146,110 +147,130 @@ public class VirtualControllerConfigurationLoader {
public static void createDefaultLayout(final VirtualController controller, final Context context) {
DisplayMetrics screen = context.getResources().getDisplayMetrics();
PreferenceConfiguration config = PreferenceConfiguration.readPreferences(context);
// NOTE: Some of these getPercent() expressions seem like they can be combined
// into a single call. Due to floating point rounding, this isn't actually possible.
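// Concrete illustration of the note above, assuming getPercent() truncates its floating-point
// result to an int (the implementation is not shown in this hunk, so this is an assumption):
int dimension = 1089;
int split  = (int) (dimension * 40 / 100f) + (int) (dimension * 10 / 100f); // 435 + 108 = 543
int single = (int) (dimension * 50 / 100f);                                 // 544
// split != single, which is why the expressions below stay uncombined.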
controller.addElement(createDigitalPad(controller, context),
getPercent(5, screen.widthPixels),
getPercent(BUTTON_BASE_Y, screen.heightPixels),
getPercent(30, screen.widthPixels),
getPercent(40, screen.heightPixels)
);
if (!config.onlyL3R3)
{
controller.addElement(createDigitalPad(controller, context),
getPercent(5, screen.widthPixels),
getPercent(BUTTON_BASE_Y, screen.heightPixels),
getPercent(30, screen.widthPixels),
getPercent(40, screen.heightPixels)
);
controller.addElement(createDigitalButton(
ControllerPacket.A_FLAG, 0, 1, "A", -1, controller, context),
getPercent(BUTTON_BASE_X, screen.widthPixels) + getPercent(BUTTON_WIDTH, screen.widthPixels),
getPercent(BUTTON_BASE_Y, screen.heightPixels) + 2 * getPercent(BUTTON_HEIGHT, screen.heightPixels),
getPercent(BUTTON_WIDTH, screen.widthPixels),
getPercent(BUTTON_HEIGHT, screen.heightPixels)
);
controller.addElement(createDigitalButton(
ControllerPacket.B_FLAG, 0, 1, "B", -1, controller, context),
getPercent(BUTTON_BASE_X, screen.widthPixels) + 2 * getPercent(BUTTON_WIDTH, screen.widthPixels),
getPercent(BUTTON_BASE_Y, screen.heightPixels) + getPercent(BUTTON_HEIGHT, screen.heightPixels),
getPercent(BUTTON_WIDTH, screen.widthPixels),
getPercent(BUTTON_HEIGHT, screen.heightPixels)
);
controller.addElement(createDigitalButton(
ControllerPacket.X_FLAG, 0, 1, "X", -1, controller, context),
getPercent(BUTTON_BASE_X, screen.widthPixels),
getPercent(BUTTON_BASE_Y, screen.heightPixels) + getPercent(BUTTON_HEIGHT, screen.heightPixels),
getPercent(BUTTON_WIDTH, screen.widthPixels),
getPercent(BUTTON_HEIGHT, screen.heightPixels)
);
controller.addElement(createDigitalButton(
ControllerPacket.Y_FLAG, 0, 1, "Y", -1, controller, context),
getPercent(BUTTON_BASE_X, screen.widthPixels) + getPercent(BUTTON_WIDTH, screen.widthPixels),
getPercent(BUTTON_BASE_Y, screen.heightPixels),
getPercent(BUTTON_WIDTH, screen.widthPixels),
getPercent(BUTTON_HEIGHT, screen.heightPixels)
);
controller.addElement(createLeftTrigger(
0, "LT", -1, controller, context),
getPercent(BUTTON_BASE_X, screen.widthPixels),
getPercent(BUTTON_BASE_Y, screen.heightPixels),
getPercent(BUTTON_WIDTH, screen.widthPixels),
getPercent(BUTTON_HEIGHT, screen.heightPixels)
);
controller.addElement(createRightTrigger(
0, "RT", -1, controller, context),
getPercent(BUTTON_BASE_X, screen.widthPixels) + 2 * getPercent(BUTTON_WIDTH, screen.widthPixels),
getPercent(BUTTON_BASE_Y, screen.heightPixels),
getPercent(BUTTON_WIDTH, screen.widthPixels),
getPercent(BUTTON_HEIGHT, screen.heightPixels)
);
controller.addElement(createDigitalButton(
ControllerPacket.LB_FLAG, 0, 1, "LB", -1, controller, context),
getPercent(BUTTON_BASE_X, screen.widthPixels),
getPercent(BUTTON_BASE_Y, screen.heightPixels) + 2 * getPercent(BUTTON_HEIGHT, screen.heightPixels),
getPercent(BUTTON_WIDTH, screen.widthPixels),
getPercent(BUTTON_HEIGHT, screen.heightPixels)
);
controller.addElement(createDigitalButton(
ControllerPacket.RB_FLAG, 0, 1, "RB", -1, controller, context),
getPercent(BUTTON_BASE_X, screen.widthPixels) + 2 * getPercent(BUTTON_WIDTH, screen.widthPixels),
getPercent(BUTTON_BASE_Y, screen.heightPixels) + 2 * getPercent(BUTTON_HEIGHT, screen.heightPixels),
getPercent(BUTTON_WIDTH, screen.widthPixels),
getPercent(BUTTON_HEIGHT, screen.heightPixels)
);
controller.addElement(createLeftStick(controller, context),
getPercent(5, screen.widthPixels),
getPercent(50, screen.heightPixels),
getPercent(40, screen.widthPixels),
getPercent(50, screen.heightPixels)
);
controller.addElement(createRightStick(controller, context),
getPercent(55, screen.widthPixels),
getPercent(50, screen.heightPixels),
getPercent(40, screen.widthPixels),
getPercent(50, screen.heightPixels)
);
controller.addElement(createDigitalButton(
ControllerPacket.BACK_FLAG, 0, 2, "BACK", -1, controller, context),
getPercent(40, screen.widthPixels),
getPercent(90, screen.heightPixels),
getPercent(10, screen.widthPixels),
getPercent(10, screen.heightPixels)
);
controller.addElement(createDigitalButton(
ControllerPacket.PLAY_FLAG, 0, 3, "START", -1, controller, context),
getPercent(40, screen.widthPixels) + getPercent(10, screen.widthPixels),
getPercent(90, screen.heightPixels),
getPercent(10, screen.widthPixels),
getPercent(10, screen.heightPixels)
);
}
controller.addElement(createDigitalButton(
ControllerPacket.A_FLAG, 0, 1, "A", -1, controller, context),
getPercent(BUTTON_BASE_X, screen.widthPixels)+getPercent(BUTTON_WIDTH, screen.widthPixels),
getPercent(BUTTON_BASE_Y, screen.heightPixels)+2*getPercent(BUTTON_HEIGHT, screen.heightPixels),
ControllerPacket.LS_CLK_FLAG, 0, 1, "L3", -1, controller, context),
getPercent(2, screen.widthPixels),
getPercent(80, screen.heightPixels),
getPercent(BUTTON_WIDTH, screen.widthPixels),
getPercent(BUTTON_HEIGHT, screen.heightPixels)
);
controller.addElement(createDigitalButton(
ControllerPacket.B_FLAG, 0, 1, "B", -1, controller, context),
getPercent(BUTTON_BASE_X, screen.widthPixels)+2*getPercent(BUTTON_WIDTH, screen.widthPixels),
getPercent(BUTTON_BASE_Y, screen.heightPixels)+getPercent(BUTTON_HEIGHT, screen.heightPixels),
ControllerPacket.RS_CLK_FLAG, 0, 1, "R3", -1, controller, context),
getPercent(89, screen.widthPixels),
getPercent(80, screen.heightPixels),
getPercent(BUTTON_WIDTH, screen.widthPixels),
getPercent(BUTTON_HEIGHT, screen.heightPixels)
);
controller.addElement(createDigitalButton(
ControllerPacket.X_FLAG, 0, 1, "X", -1, controller, context),
getPercent(BUTTON_BASE_X, screen.widthPixels),
getPercent(BUTTON_BASE_Y, screen.heightPixels)+getPercent(BUTTON_HEIGHT, screen.heightPixels),
getPercent(BUTTON_WIDTH, screen.widthPixels),
getPercent(BUTTON_HEIGHT, screen.heightPixels)
);
controller.addElement(createDigitalButton(
ControllerPacket.Y_FLAG, 0, 1, "Y", -1, controller, context),
getPercent(BUTTON_BASE_X, screen.widthPixels)+getPercent(BUTTON_WIDTH, screen.widthPixels),
getPercent(BUTTON_BASE_Y, screen.heightPixels),
getPercent(BUTTON_WIDTH, screen.widthPixels),
getPercent(BUTTON_HEIGHT, screen.heightPixels)
);
controller.addElement(createLeftTrigger(
0, "LT", -1, controller, context),
getPercent(BUTTON_BASE_X, screen.widthPixels),
getPercent(BUTTON_BASE_Y, screen.heightPixels),
getPercent(BUTTON_WIDTH, screen.widthPixels),
getPercent(BUTTON_HEIGHT, screen.heightPixels)
);
controller.addElement(createRightTrigger(
0, "RT", -1, controller, context),
getPercent(BUTTON_BASE_X, screen.widthPixels)+2*getPercent(BUTTON_WIDTH, screen.widthPixels),
getPercent(BUTTON_BASE_Y, screen.heightPixels),
getPercent(BUTTON_WIDTH, screen.widthPixels),
getPercent(BUTTON_HEIGHT, screen.heightPixels)
);
controller.addElement(createDigitalButton(
ControllerPacket.LB_FLAG, 0, 1, "LB", -1, controller, context),
getPercent(BUTTON_BASE_X, screen.widthPixels),
getPercent(BUTTON_BASE_Y, screen.heightPixels)+2*getPercent(BUTTON_HEIGHT, screen.heightPixels),
getPercent(BUTTON_WIDTH, screen.widthPixels),
getPercent(BUTTON_HEIGHT, screen.heightPixels)
);
controller.addElement(createDigitalButton(
ControllerPacket.RB_FLAG, 0, 1, "RB", -1, controller, context),
getPercent(BUTTON_BASE_X, screen.widthPixels)+2*getPercent(BUTTON_WIDTH, screen.widthPixels),
getPercent(BUTTON_BASE_Y, screen.heightPixels)+2*getPercent(BUTTON_HEIGHT, screen.heightPixels),
getPercent(BUTTON_WIDTH, screen.widthPixels),
getPercent(BUTTON_HEIGHT, screen.heightPixels)
);
controller.addElement(createLeftStick(controller, context),
getPercent(5, screen.widthPixels),
getPercent(50, screen.heightPixels),
getPercent(40, screen.widthPixels),
getPercent(50, screen.heightPixels)
);
controller.addElement(createRightStick(controller, context),
getPercent(55, screen.widthPixels),
getPercent(50, screen.heightPixels),
getPercent(40, screen.widthPixels),
getPercent(50, screen.heightPixels)
);
controller.addElement(createDigitalButton(
ControllerPacket.BACK_FLAG, 0, 2, "BACK", -1, controller, context),
getPercent(40, screen.widthPixels),
getPercent(90, screen.heightPixels),
getPercent(10, screen.widthPixels),
getPercent(10, screen.heightPixels)
);
controller.addElement(createDigitalButton(
ControllerPacket.PLAY_FLAG, 0, 3, "START", -1, controller, context),
getPercent(40, screen.widthPixels)+getPercent(10, screen.widthPixels),
getPercent(90, screen.heightPixels),
getPercent(10, screen.widthPixels),
getPercent(10, screen.heightPixels)
);
}
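Every element in the layout above is positioned through getPercent, which maps a percentage of the screen dimension to absolute pixels. A minimal sketch of such a helper, assuming a (float percent, int dimension) signature rather than quoting the actual implementation:
    private static int getPercent(float percent, int dimension) {
        // Convert a percentage (0-100) of a screen dimension to absolute pixels
        return (int) (((float) dimension / 100f) * percent);
    }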
/*

View File

@@ -0,0 +1,5 @@
package com.limelight.binding.video;
public interface CrashListener {
void notifyCrash(Exception e);
}

View File

@@ -1,8 +0,0 @@
package com.limelight.binding.video;
import com.limelight.nvstream.av.video.VideoDecoderRenderer;
public abstract class EnhancedDecoderRenderer extends VideoDecoderRenderer {
public abstract boolean isHevcSupported();
public abstract boolean isAvcSupported();
}

View File

@@ -7,9 +7,10 @@ import java.util.Collections;
import java.util.LinkedList;
import java.util.List;
import java.util.Locale;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import android.annotation.SuppressLint;
import android.annotation.TargetApi;
import android.app.ActivityManager;
import android.content.Context;
import android.content.pm.ConfigurationInfo;
@@ -27,11 +28,17 @@ public class MediaCodecHelper {
private static final List<String> blacklistedDecoderPrefixes;
private static final List<String> spsFixupBitstreamFixupDecoderPrefixes;
private static final List<String> whitelistedAdaptiveResolutionPrefixes;
private static final List<String> blacklistedAdaptivePlaybackPrefixes;
private static final List<String> deprioritizedHevcDecoders;
private static final List<String> baselineProfileHackPrefixes;
private static final List<String> directSubmitPrefixes;
private static final List<String> constrainedHighProfilePrefixes;
private static final List<String> whitelistedHevcDecoders;
private static final List<String> refFrameInvalidationAvcPrefixes;
private static final List<String> refFrameInvalidationHevcPrefixes;
private static boolean isLowEndSnapdragon = false;
private static boolean initialized = false;
static {
directSubmitPrefixes = new LinkedList<>();
@@ -45,26 +52,43 @@ public class MediaCodecHelper {
directSubmitPrefixes.add("omx.brcm");
directSubmitPrefixes.add("omx.TI");
directSubmitPrefixes.add("omx.arc");
directSubmitPrefixes.add("omx.nvidia");
}
static {
refFrameInvalidationAvcPrefixes = new LinkedList<>();
refFrameInvalidationHevcPrefixes = new LinkedList<>();
// Qualcomm and NVIDIA may be added at runtime
}
static {
preferredDecoders = new LinkedList<>();
}
static {
blacklistedDecoderPrefixes = new LinkedList<>();
// Software decoders that don't support H264 high profile
blacklistedDecoderPrefixes.add("omx.google");
blacklistedDecoderPrefixes.add("AVCDecoder");
// Without bitstream fixups, we perform horribly on NVIDIA's HEVC
// decoder. While not strictly necessary, I'm going to fully blacklist this
// one to avoid users getting inaccurate impressions of Tegra X1/Moonlight performance.
blacklistedDecoderPrefixes.add("OMX.Nvidia.h265.decode");
// Blacklist software decoders that don't support H264 high profile,
// but exclude the official AOSP emulator from this restriction.
if (!Build.HARDWARE.equals("ranchu") || !Build.BRAND.equals("google")) {
blacklistedDecoderPrefixes.add("omx.google");
blacklistedDecoderPrefixes.add("AVCDecoder");
}
// Never use ffmpeg decoders since they're software decoders
blacklistedDecoderPrefixes.add("OMX.ffmpeg");
// Forcibly disable these decoders because:
// 1) They are software decoders, so the performance is terrible
// 2) They crash with our HEVC stream anyway (at least prior to CSD batching)
blacklistedDecoderPrefixes.add("OMX.qcom.video.decoder.hevcswvdec");
blacklistedDecoderPrefixes.add("OMX.SEC.hevc.sw.dec");
}
static {
// If a decoder qualifies for reference frame invalidation,
// these entries will be ignored for those decoders.
spsFixupBitstreamFixupDecoderPrefixes = new LinkedList<>();
spsFixupBitstreamFixupDecoderPrefixes.add("omx.nvidia");
spsFixupBitstreamFixupDecoderPrefixes.add("omx.qcom");
@@ -73,11 +97,13 @@ public class MediaCodecHelper {
baselineProfileHackPrefixes = new LinkedList<>();
baselineProfileHackPrefixes.add("omx.intel");
whitelistedAdaptiveResolutionPrefixes = new LinkedList<>();
whitelistedAdaptiveResolutionPrefixes.add("omx.nvidia");
whitelistedAdaptiveResolutionPrefixes.add("omx.qcom");
whitelistedAdaptiveResolutionPrefixes.add("omx.sec");
whitelistedAdaptiveResolutionPrefixes.add("omx.TI");
blacklistedAdaptivePlaybackPrefixes = new LinkedList<>();
// The Intel decoder on Lollipop on Nexus Player would increase latency badly
// if adaptive playback was enabled, so let's avoid it to be safe.
blacklistedAdaptivePlaybackPrefixes.add("omx.intel");
// The MediaTek decoder crashes at 1080p when adaptive playback is enabled
// on some Android TV devices with H.265 only.
blacklistedAdaptivePlaybackPrefixes.add("omx.mtk");
constrainedHighProfilePrefixes = new LinkedList<>();
constrainedHighProfilePrefixes.add("omx.intel");
@@ -86,11 +112,22 @@ public class MediaCodecHelper {
static {
whitelistedHevcDecoders = new LinkedList<>();
// Allow software HEVC decoding in the official AOSP emulator
if (Build.HARDWARE.equals("ranchu") && Build.BRAND.equals("google")) {
whitelistedHevcDecoders.add("omx.google");
}
// Exynos seems to be the only HEVC decoder that works reliably
whitelistedHevcDecoders.add("omx.exynos");
// TODO: This needs a similar fixup to the Tegra 3, otherwise it buffers 16 frames
//whitelistedHevcDecoders.add("omx.nvidia");
// On Darcy (Shield 2017), HEVC runs fine with no fixups required.
// For some reason, other X1 implementations require bitstream fixups.
if (Build.DEVICE.equalsIgnoreCase("darcy")) {
whitelistedHevcDecoders.add("omx.nvidia");
}
else {
// TODO: This needs a similar fixup to the Tegra 3, otherwise it buffers 16 frames
}
// Sony ATVs have broken MediaTek codecs (decoder hangs after rendering the first frame).
// I know the Fire TV 2 works, so I'll just whitelist Amazon devices which seem
@@ -108,11 +145,64 @@ public class MediaCodecHelper {
// during initialization to avoid SoCs with broken HEVC decoders.
}
public static void initializeWithContext(Context context) {
static {
deprioritizedHevcDecoders = new LinkedList<>();
// These are decoders that work but aren't used by default for various reasons.
// Qualcomm is currently the only decoder in this group.
}
private static boolean isLowEndSnapdragonRenderer(String glRenderer) {
glRenderer = glRenderer.toLowerCase().trim();
if (!glRenderer.contains("adreno")) {
return false;
}
Pattern modelNumberPattern = Pattern.compile("(.*)([0-9]{3})(.*)");
Matcher matcher = modelNumberPattern.matcher(glRenderer);
if (!matcher.matches()) {
return false;
}
String modelNumber = matcher.group(2);
LimeLog.info("Found Adreno GPU: "+modelNumber);
// The current logic is to identify low-end SoCs based on a zero in the x0x place.
return modelNumber.charAt(1) == '0';
}
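For illustration only (treating the private method as accessible), a renderer string like "Adreno (TM) 505" has a zero in the tens digit and would be flagged as low end, while "Adreno (TM) 530" and non-Adreno GPUs would not:
    String[] renderers = { "Adreno (TM) 505", "Adreno (TM) 530", "Mali-G71" };
    for (String r : renderers) {
        // Expected: true for 505 (tens digit is 0), false for 530 and non-Adreno GPUs
        System.out.println(r + " -> " + isLowEndSnapdragonRenderer(r));
    }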
public static void initialize(Context context, String glRenderer) {
if (initialized) {
return;
}
ActivityManager activityManager =
(ActivityManager) context.getSystemService(Context.ACTIVITY_SERVICE);
ConfigurationInfo configInfo = activityManager.getDeviceConfigurationInfo();
if (configInfo.reqGlEsVersion != ConfigurationInfo.GL_ES_VERSION_UNDEFINED) {
LimeLog.info("OpenGL ES version: "+configInfo.reqGlEsVersion);
isLowEndSnapdragon = isLowEndSnapdragonRenderer(glRenderer);
// Tegra K1 and later can do reference frame invalidation properly
if (configInfo.reqGlEsVersion >= 0x30000) {
LimeLog.info("Added omx.nvidia to AVC reference frame invalidation support list");
refFrameInvalidationAvcPrefixes.add("omx.nvidia");
LimeLog.info("Added omx.qcom to AVC reference frame invalidation support list");
refFrameInvalidationAvcPrefixes.add("omx.qcom");
// Prior to M, we were tricking the decoder into using baseline profile, which
// won't support RFI properly.
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) {
LimeLog.info("Added omx.intel to AVC reference frame invalidation support list");
refFrameInvalidationAvcPrefixes.add("omx.intel");
}
}
// Qualcomm's early HEVC decoders break hard on our HEVC stream. The best check to
// tell the good decoders from the bad ones is the generation of Adreno GPU included:
// 3xx - bad
@@ -120,15 +210,27 @@ public class MediaCodecHelper {
//
// Unfortunately, it's not that easy to get that information here, so I'll use an
// approximation by checking the GLES level (<= 3.0 is bad).
LimeLog.info("OpenGL ES version: "+configInfo.reqGlEsVersion);
if (configInfo.reqGlEsVersion > 0x30000) {
LimeLog.info("Added omx.qcom to supported decoders based on GLES 3.1+ support");
whitelistedHevcDecoders.add("omx.qcom");
// We prefer reference frame invalidation support (which is only doable on AVC on
// older Qualcomm chips) over enabling HEVC by default. The user can override using the settings
// to force HEVC on. If HDR or mobile data will be used, we'll override this and use
// HEVC anyway.
LimeLog.info("Added omx.qcom to deprioritized HEVC decoders based on GLES 3.1+ support");
deprioritizedHevcDecoders.add("omx.qcom");
}
else {
blacklistedDecoderPrefixes.add("OMX.qcom.video.decoder.hevc");
}
}
}
initialized = true;
}
private static boolean isDecoderInList(List<String> decoderList, String decoderName) {
if (!initialized) {
throw new IllegalStateException("MediaCodecHelper must be initialized before use");
}
for (String badPrefix : decoderList) {
if (decoderName.length() >= badPrefix.length()) {
String prefix = decoderName.substring(0, badPrefix.length());
@@ -145,19 +247,14 @@ public class MediaCodecHelper {
return System.nanoTime() / 1000000L;
}
@TargetApi(Build.VERSION_CODES.KITKAT)
public static boolean decoderSupportsAdaptivePlayback(String decoderName) {
/*
FIXME: Intel's decoder on Nexus Player forces the high latency path if adaptive playback is enabled
so we'll keep it off for now, since we don't know whether other devices also do the same
if (isDecoderInList(whitelistedAdaptiveResolutionPrefixes, decoderName)) {
LimeLog.info("Adaptive playback supported (whitelist)");
return true;
}
public static boolean decoderSupportsAdaptivePlayback(MediaCodecInfo decoderInfo) {
// Possibly enable adaptive playback on KitKat and above
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.KITKAT) {
if (isDecoderInList(blacklistedAdaptivePlaybackPrefixes, decoderInfo.getName())) {
LimeLog.info("Decoder blacklisted for adaptive playback");
return false;
}
try {
if (decoderInfo.getCapabilitiesForType("video/avc").
isFeatureSupported(CodecCapabilities.FEATURE_AdaptivePlayback))
@@ -169,7 +266,7 @@ public class MediaCodecHelper {
} catch (Exception e) {
// Tolerate buggy codecs
}
}*/
}
return false;
}
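When this check passes, the renderer can advertise the largest dimensions it may feed the codec so mid-stream resolution changes don't require a full reconfigure. A hedged sketch of that usage (width, height, and decoderInfo are assumed to be in scope; this is not the exact renderer code):
    MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
    if (MediaCodecHelper.decoderSupportsAdaptivePlayback(decoderInfo)) {
        // The codec can follow mid-stream resolution changes, so advertise the
        // largest dimensions we might feed it instead of reconfiguring later.
        format.setInteger(MediaFormat.KEY_MAX_WIDTH, width);
        format.setInteger(MediaFormat.KEY_MAX_HEIGHT, height);
    }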
@@ -190,7 +287,19 @@ public class MediaCodecHelper {
return isDecoderInList(baselineProfileHackPrefixes, decoderName);
}
public static boolean decoderIsWhitelistedForHevc(String decoderName) {
public static boolean decoderSupportsRefFrameInvalidationAvc(String decoderName, int videoHeight) {
// Reference frame invalidation is broken on low-end Snapdragon SoCs at 1080p.
if (videoHeight > 720 && isLowEndSnapdragon) {
return false;
}
return isDecoderInList(refFrameInvalidationAvcPrefixes, decoderName);
}
public static boolean decoderSupportsRefFrameInvalidationHevc(String decoderName) {
return isDecoderInList(refFrameInvalidationHevcPrefixes, decoderName);
}
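For illustration, on a device whose GL renderer reported a low-end Adreno (x0x), reference frame invalidation could be offered for AVC at 720p but refused at 1080p. A small sketch under that assumption (the decoder name is just an example, and initialize() must have run first):
    String decoderName = "OMX.qcom.video.decoder.avc";  // example name
    boolean rfi720p  = MediaCodecHelper.decoderSupportsRefFrameInvalidationAvc(decoderName, 720);
    boolean rfi1080p = MediaCodecHelper.decoderSupportsRefFrameInvalidationAvc(decoderName, 1080);
    // On a low-end Snapdragon (Adreno x0x), rfi720p can be true while rfi1080p is false.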
public static boolean decoderIsWhitelistedForHevc(String decoderName, boolean meteredData, boolean willStreamHdr) {
// TODO: Shield Tablet K1/LTE?
//
// NVIDIA does partial HEVC acceleration on the Shield Tablet. I don't know
@@ -212,6 +321,25 @@ public class MediaCodecHelper {
return false;
}
//
// Software decoders are terrible and we never want to use them.
// We want to catch decoders like:
// OMX.qcom.video.decoder.hevcswvdec
// OMX.SEC.hevc.sw.dec
//
if (decoderName.contains("sw")) {
return false;
}
// Some devices have HEVC decoders that we prefer not to use,
// typically because they can't support reference frame invalidation.
// However, we will use them for HDR and for streaming over mobile networks
// since they work fine otherwise.
if ((meteredData || willStreamHdr) && isDecoderInList(deprioritizedHevcDecoders, decoderName)) {
LimeLog.info("Selected deprioritized decoder");
return true;
}
return isDecoderInList(whitelistedHevcDecoders, decoderName);
}
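Putting the pieces together, a caller deciding whether to request HEVC might look roughly like this; hevcDecoderInfo, meteredConnection, and willStreamHdr are assumptions for illustration, not the actual call site:
    // meteredConnection and willStreamHdr would come from connectivity state and stream settings
    boolean useHevc = hevcDecoderInfo != null &&
            MediaCodecHelper.decoderIsWhitelistedForHevc(
                    hevcDecoderInfo.getName(), meteredConnection, willStreamHdr);
Per the comments above, a deprioritized decoder is still picked when HDR or a metered connection is in play, and the user can force HEVC on through the settings regardless.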
@@ -259,6 +387,10 @@ public class MediaCodecHelper {
// This is a different algorithm than the other findXXXDecoder functions,
// because we want to evaluate the decoders in our list's order
// rather than MediaCodecList's order
if (!initialized) {
throw new IllegalStateException("MediaCodecHelper must be initialized before use");
}
for (String preferredDecoder : preferredDecoders) {
for (MediaCodecInfo codecInfo : getMediaCodecList()) {

View File

@@ -25,6 +25,8 @@ public class ComputerDatabaseManager {
private static final String REMOTE_IP_COLUMN_NAME = "RemoteIp";
private static final String MAC_COLUMN_NAME = "Mac";
private static final String ADDRESS_PREFIX = "ADDRESS_PREFIX__";
private SQLiteDatabase computerDb;
public ComputerDatabaseManager(Context c) {
@@ -60,50 +62,74 @@ public class ComputerDatabaseManager {
ContentValues values = new ContentValues();
values.put(COMPUTER_NAME_COLUMN_NAME, details.name);
values.put(COMPUTER_UUID_COLUMN_NAME, details.uuid.toString());
values.put(LOCAL_IP_COLUMN_NAME, details.localIp.getAddress());
values.put(REMOTE_IP_COLUMN_NAME, details.remoteIp.getAddress());
values.put(LOCAL_IP_COLUMN_NAME, ADDRESS_PREFIX+details.localAddress);
values.put(REMOTE_IP_COLUMN_NAME, ADDRESS_PREFIX+details.remoteAddress);
values.put(MAC_COLUMN_NAME, details.macAddress);
return -1 != computerDb.insertWithOnConflict(COMPUTER_TABLE_NAME, null, values, SQLiteDatabase.CONFLICT_REPLACE);
}
private ComputerDetails getComputerFromCursor(Cursor c) {
ComputerDetails details = new ComputerDetails();
details.name = c.getString(0);
String uuidStr = c.getString(1);
try {
details.uuid = UUID.fromString(uuidStr);
} catch (IllegalArgumentException e) {
// We'll delete this entry
LimeLog.severe("DB: Corrupted UUID for "+details.name);
}
// An earlier schema defined addresses as byte blobs. We'll
// gracefully migrate those to strings so we can store DNS names
// too. To disambiguate, we'll need to prefix them with a string
// longer than the maximum allowable IP address length.
try {
details.localAddress = InetAddress.getByAddress(c.getBlob(2)).getHostAddress();
LimeLog.warning("DB: Legacy local address for "+details.name);
} catch (UnknownHostException e) {
// This is probably a hostname/address with the prefix string
String stringData = c.getString(2);
if (stringData.startsWith(ADDRESS_PREFIX)) {
details.localAddress = c.getString(2).substring(ADDRESS_PREFIX.length());
}
else {
LimeLog.severe("DB: Corrupted local address for "+details.name);
}
}
try {
details.remoteAddress = InetAddress.getByAddress(c.getBlob(3)).getHostAddress();
LimeLog.warning("DB: Legacy remote address for "+details.name);
} catch (UnknownHostException e) {
// This is probably a hostname/address with the prefix string
String stringData = c.getString(3);
if (stringData.startsWith(ADDRESS_PREFIX)) {
details.remoteAddress = c.getString(3).substring(ADDRESS_PREFIX.length());
}
else {
LimeLog.severe("DB: Corrupted local address for "+details.name);
}
}
details.macAddress = c.getString(4);
// This signifies we don't have dynamic state (like pair state)
details.state = ComputerDetails.State.UNKNOWN;
details.reachability = ComputerDetails.Reachability.UNKNOWN;
return details;
}
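The ADDRESS_PREFIX trick works because a legacy row stores a raw 4- or 16-byte address blob, which InetAddress.getByAddress() accepts, while a prefixed string read as a blob has an invalid length and throws UnknownHostException, steering us onto the string path. A standalone sketch of the same disambiguation, assuming java.net.InetAddress and UnknownHostException are imported:
    static String readAddressColumn(byte[] blobData, String stringData) {
        try {
            // Legacy rows stored a raw 4-byte (IPv4) or 16-byte (IPv6) address blob
            return InetAddress.getByAddress(blobData).getHostAddress();
        } catch (UnknownHostException e) {
            // New rows store a string prefixed with ADDRESS_PREFIX, whose byte length
            // can never be a valid address blob length
            if (stringData != null && stringData.startsWith("ADDRESS_PREFIX__")) {
                return stringData.substring("ADDRESS_PREFIX__".length());
            }
            return null;
        }
    }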
public List<ComputerDetails> getAllComputers() {
Cursor c = computerDb.rawQuery("SELECT * FROM "+COMPUTER_TABLE_NAME, null);
LinkedList<ComputerDetails> computerList = new LinkedList<>();
while (c.moveToNext()) {
ComputerDetails details = new ComputerDetails();
details.name = c.getString(0);
String uuidStr = c.getString(1);
try {
details.uuid = UUID.fromString(uuidStr);
} catch (IllegalArgumentException e) {
// We'll delete this entry
LimeLog.severe("DB: Corrupted UUID for "+details.name);
}
try {
details.localIp = InetAddress.getByAddress(c.getBlob(2));
} catch (UnknownHostException e) {
// We'll delete this entry
LimeLog.severe("DB: Corrupted local IP for "+details.name);
}
try {
details.remoteIp = InetAddress.getByAddress(c.getBlob(3));
} catch (UnknownHostException e) {
// We'll delete this entry
LimeLog.severe("DB: Corrupted remote IP for "+details.name);
}
details.macAddress = c.getString(4);
// This signifies we don't have dynamic state (like pair state)
details.state = ComputerDetails.State.UNKNOWN;
details.reachability = ComputerDetails.Reachability.UNKNOWN;
ComputerDetails details = getComputerFromCursor(c);
// If a field is corrupt or missing, skip the database entry
if (details.uuid == null || details.localIp == null || details.remoteIp == null ||
if (details.uuid == null || details.localAddress == null || details.remoteAddress == null ||
details.macAddress == null) {
continue;
}
@@ -119,46 +145,17 @@ public class ComputerDatabaseManager {
public ComputerDetails getComputerByName(String name) {
Cursor c = computerDb.query(COMPUTER_TABLE_NAME, null, COMPUTER_NAME_COLUMN_NAME+"=?", new String[]{name}, null, null, null);
ComputerDetails details = new ComputerDetails();
if (!c.moveToFirst()) {
// No matching computer
c.close();
return null;
}
details.name = c.getString(0);
String uuidStr = c.getString(1);
try {
details.uuid = UUID.fromString(uuidStr);
} catch (IllegalArgumentException e) {
// We'll delete this entry
LimeLog.severe("DB: Corrupted UUID for "+details.name);
}
try {
details.localIp = InetAddress.getByAddress(c.getBlob(2));
} catch (UnknownHostException e) {
// We'll delete this entry
LimeLog.severe("DB: Corrupted local IP for "+details.name);
}
try {
details.remoteIp = InetAddress.getByAddress(c.getBlob(3));
} catch (UnknownHostException e) {
// We'll delete this entry
LimeLog.severe("DB: Corrupted remote IP for "+details.name);
}
details.macAddress = c.getString(4);
ComputerDetails details = getComputerFromCursor(c);
c.close();
details.state = ComputerDetails.State.UNKNOWN;
details.reachability = ComputerDetails.Reachability.UNKNOWN;
// If a field is corrupt or missing, delete the database entry
if (details.uuid == null || details.localIp == null || details.remoteIp == null ||
if (details.uuid == null || details.localAddress == null || details.remoteAddress == null ||
details.macAddress == null) {
deleteComputer(details.name);
return null;

View File

@@ -3,5 +3,5 @@ package com.limelight.computers;
import com.limelight.nvstream.http.ComputerDetails;
public interface ComputerManagerListener {
public void notifyComputerUpdated(ComputerDetails details);
void notifyComputerUpdated(ComputerDetails details);
}

View File

@@ -3,7 +3,6 @@ package com.limelight.computers;
import java.io.IOException;
import java.io.OutputStream;
import java.io.StringReader;
import java.net.InetAddress;
import java.net.InetSocketAddress;
import java.net.Socket;
import java.util.LinkedList;
@@ -20,6 +19,7 @@ import com.limelight.nvstream.http.NvHTTP;
import com.limelight.nvstream.mdns.MdnsComputer;
import com.limelight.nvstream.mdns.MdnsDiscoveryListener;
import com.limelight.utils.CacheHelper;
import com.limelight.utils.ServerHelper;
import android.app.Service;
import android.content.ComponentName;
@@ -154,7 +154,7 @@ public class ComputerManagerService extends Service {
}
}
};
t.setName("Polling thread for " + tuple.computer.localIp.getHostAddress());
t.setName("Polling thread for " + tuple.computer.localAddress);
return t;
}
@@ -210,8 +210,8 @@ public class ComputerManagerService extends Service {
}
}
public boolean addComputerBlocking(InetAddress addr) {
return ComputerManagerService.this.addComputerBlocking(addr);
public boolean addComputerBlocking(String addr, boolean manuallyAdded) {
return ComputerManagerService.this.addComputerBlocking(addr, manuallyAdded);
}
public void removeComputer(String name) {
@@ -289,7 +289,7 @@ public class ComputerManagerService extends Service {
@Override
public void notifyComputerAdded(MdnsComputer computer) {
// Kick off a serverinfo poll on this machine
addComputerBlocking(computer.getAddress());
addComputerBlocking(computer.getAddress().getHostAddress(), false);
}
@Override
@@ -305,15 +305,22 @@ public class ComputerManagerService extends Service {
};
}
private void addTuple(ComputerDetails details) {
private void addTuple(ComputerDetails details, boolean manuallyAdded) {
synchronized (pollingTuples) {
for (PollingTuple tuple : pollingTuples) {
// Check if this is the same computer
if (tuple.computer.uuid.equals(details.uuid)) {
// Update details anyway in case this machine has been re-added by IP
// after not being reachable by our existing information
tuple.computer.localIp = details.localIp;
tuple.computer.remoteIp = details.remoteIp;
if (manuallyAdded) {
// Update details anyway in case this machine has been re-added by IP
// after not being reachable by our existing information
tuple.computer.localAddress = details.localAddress;
tuple.computer.remoteAddress = details.remoteAddress;
}
else {
// This indicates that mDNS discovered this address, so we
// should only apply the local address.
tuple.computer.localAddress = details.localAddress;
}
// Start a polling thread if polling is active
if (pollingActive && tuple.thread == null) {
@@ -338,11 +345,11 @@ public class ComputerManagerService extends Service {
}
}
public boolean addComputerBlocking(InetAddress addr) {
public boolean addComputerBlocking(String addr, boolean manuallyAdded) {
// Setup a placeholder
ComputerDetails fakeDetails = new ComputerDetails();
fakeDetails.localIp = addr;
fakeDetails.remoteIp = addr;
fakeDetails.localAddress = addr;
fakeDetails.remoteAddress = addr;
// Block while we try to fill the details
try {
@@ -356,7 +363,7 @@ public class ComputerManagerService extends Service {
LimeLog.info("New PC ("+fakeDetails.name+") is UUID "+fakeDetails.uuid);
// Start a polling thread for this machine
addTuple(fakeDetails);
addTuple(fakeDetails, manuallyAdded);
return true;
}
else {
@@ -404,14 +411,14 @@ public class ComputerManagerService extends Service {
}
}
private ComputerDetails tryPollIp(ComputerDetails details, InetAddress ipAddr) {
private ComputerDetails tryPollIp(ComputerDetails details, String address) {
// Fast poll this address first to determine if we can connect at the TCP layer
if (!fastPollIp(ipAddr)) {
if (!fastPollIp(address)) {
return null;
}
try {
NvHTTP http = new NvHTTP(ipAddr, idManager.getUniqueId(),
NvHTTP http = new NvHTTP(address, idManager.getUniqueId(),
null, PlatformBinding.getCryptoProvider(ComputerManagerService.this));
ComputerDetails newDetails = http.getComputerDetails();
@@ -432,10 +439,10 @@ public class ComputerManagerService extends Service {
// Just try to establish a TCP connection to speculatively detect a running
// GFE server
private boolean fastPollIp(InetAddress addr) {
private boolean fastPollIp(String address) {
Socket s = new Socket();
try {
s.connect(new InetSocketAddress(addr, NvHTTP.HTTPS_PORT), FAST_POLL_TIMEOUT);
s.connect(new InetSocketAddress(address, NvHTTP.HTTPS_PORT), FAST_POLL_TIMEOUT);
s.close();
return true;
} catch (IOException e) {
@@ -443,11 +450,11 @@ public class ComputerManagerService extends Service {
}
}
private void startFastPollThread(final InetAddress addr, final boolean[] info) {
private void startFastPollThread(final String address, final boolean[] info) {
Thread t = new Thread() {
@Override
public void run() {
boolean pollRes = fastPollIp(addr);
boolean pollRes = fastPollIp(address);
synchronized (info) {
info[0] = true; // Done
@@ -457,16 +464,16 @@ public class ComputerManagerService extends Service {
}
}
};
t.setName("Fast Poll - "+addr.getHostAddress());
t.setName("Fast Poll - "+address);
t.start();
}
private ComputerDetails.Reachability fastPollPc(final InetAddress local, final InetAddress remote) throws InterruptedException {
private ComputerDetails.Reachability fastPollPc(final String localAddress, final String remoteAddress) throws InterruptedException {
final boolean[] remoteInfo = new boolean[2];
final boolean[] localInfo = new boolean[2];
startFastPollThread(local, localInfo);
startFastPollThread(remote, remoteInfo);
startFastPollThread(localAddress, localInfo);
startFastPollThread(remoteAddress, remoteInfo);
// Check local first
synchronized (localInfo) {
@@ -499,13 +506,13 @@ public class ComputerManagerService extends Service {
// If the local address is routable across the Internet,
// always consider this PC remote, to be conservative
if (details.localIp.equals(details.remoteIp)) {
if (details.localAddress.equals(details.remoteAddress)) {
reachability = ComputerDetails.Reachability.REMOTE;
}
else {
// Do a TCP-level connection to the HTTP server to see if it's listening
LimeLog.info("Starting fast poll for "+details.name+" ("+details.localIp+", "+details.remoteIp+")");
reachability = fastPollPc(details.localIp, details.remoteIp);
LimeLog.info("Starting fast poll for "+details.name+" ("+details.localAddress +", "+details.remoteAddress +")");
reachability = fastPollPc(details.localAddress, details.remoteAddress);
LimeLog.info("Fast poll for "+details.name+" returned "+reachability.toString());
// If no connection could be established to either IP address, there's nothing we can do
@@ -517,39 +524,39 @@ public class ComputerManagerService extends Service {
boolean localFirst = (reachability == ComputerDetails.Reachability.LOCAL);
if (localFirst) {
polledDetails = tryPollIp(details, details.localIp);
polledDetails = tryPollIp(details, details.localAddress);
}
else {
polledDetails = tryPollIp(details, details.remoteIp);
polledDetails = tryPollIp(details, details.remoteAddress);
}
InetAddress reachableAddr = null;
if (polledDetails == null && !details.localIp.equals(details.remoteIp)) {
String reachableAddr = null;
if (polledDetails == null && !details.localAddress.equals(details.remoteAddress)) {
// Failed, so let's try the fallback
if (!localFirst) {
polledDetails = tryPollIp(details, details.localIp);
polledDetails = tryPollIp(details, details.localAddress);
}
else {
polledDetails = tryPollIp(details, details.remoteIp);
polledDetails = tryPollIp(details, details.remoteAddress);
}
if (polledDetails != null) {
// The fallback poll worked
reachableAddr = !localFirst ? details.localIp : details.remoteIp;
reachableAddr = !localFirst ? details.localAddress : details.remoteAddress;
}
}
else if (polledDetails != null) {
reachableAddr = localFirst ? details.localIp : details.remoteIp;
reachableAddr = localFirst ? details.localAddress : details.remoteAddress;
}
if (reachableAddr == null) {
return null;
}
if (polledDetails.remoteIp.equals(reachableAddr)) {
if (polledDetails.remoteAddress.equals(reachableAddr)) {
polledDetails.reachability = ComputerDetails.Reachability.REMOTE;
}
else if (polledDetails.localIp.equals(reachableAddr)) {
else if (polledDetails.localAddress.equals(reachableAddr)) {
polledDetails.reachability = ComputerDetails.Reachability.LOCAL;
}
else {
@@ -571,7 +578,7 @@ public class ComputerManagerService extends Service {
ReachabilityTuple confirmationReachTuple = pollForReachability(initialReachTuple.computer);
if (confirmationReachTuple == null) {
// Neither of those seem to work, so we'll hold onto the address that did work
initialReachTuple.computer.localIp = initialReachTuple.reachableAddress;
initialReachTuple.computer.localAddress = initialReachTuple.reachableAddress;
initialReachTuple.computer.reachability = ComputerDetails.Reachability.LOCAL;
}
else {
@@ -581,8 +588,11 @@ public class ComputerManagerService extends Service {
}
}
// Save the old MAC address
// Save some details about the old state of the PC that we may wish
// to restore later.
String savedMacAddress = details.macAddress;
String savedLocalAddress = details.localAddress;
String savedRemoteAddress = details.remoteAddress;
// If we got here, it's reachable
details.update(initialReachTuple.computer);
@@ -593,6 +603,33 @@ public class ComputerManagerService extends Service {
details.macAddress = savedMacAddress;
}
// We never want to lose IP addresses by polling server info. If we get a poll back
// where localAddress == remoteAddress but savedLocalAddress != savedRemoteAddress,
// then we've lost an address in the polling and we should restore the one that's missing.
if (details.localAddress.equals(details.remoteAddress) &&
!savedLocalAddress.equals(savedRemoteAddress)) {
if (details.localAddress.equals(savedLocalAddress)) {
// Local addresses are identical, so put the old remote address back
details.remoteAddress = savedRemoteAddress;
}
else if (details.remoteAddress.equals(savedRemoteAddress)) {
// Remote addresses are identical, so put the old local address back
details.localAddress = savedLocalAddress;
}
else {
// Neither IP address matches. Let's restore the remote address to be safe.
details.remoteAddress = savedRemoteAddress;
}
// Now update the reachability so the correct address is used
if (details.localAddress.equals(initialReachTuple.reachableAddress)) {
details.reachability = ComputerDetails.Reachability.LOCAL;
}
else {
details.reachability = ComputerDetails.Reachability.REMOTE;
}
}
return true;
}
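As a concrete illustration of the restoration rule above: if the saved pair was (local 192.168.1.50, remote 203.0.113.7) and the poll comes back with both addresses equal to 203.0.113.7, the remote addresses match while the locals differ, so the saved local address is put back and reachability is recomputed against the address that actually answered. A standalone sketch of that rule (not the service code itself):
    static String[] restoreAddresses(String newLocal, String newRemote,
                                     String savedLocal, String savedRemote) {
        if (newLocal.equals(newRemote) && !savedLocal.equals(savedRemote)) {
            if (newLocal.equals(savedLocal)) {
                newRemote = savedRemote;   // local matched, so restore the old remote address
            }
            else if (newRemote.equals(savedRemote)) {
                newLocal = savedLocal;     // remote matched, so restore the old local address
            }
            else {
                newRemote = savedRemote;   // neither matched; restore the remote to be safe
            }
        }
        return new String[] { newLocal, newRemote };
    }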
@@ -616,7 +653,7 @@ public class ComputerManagerService extends Service {
for (ComputerDetails computer : dbManager.getAllComputers()) {
// Add tuples for each computer
addTuple(computer);
addTuple(computer, true);
}
releaseLocalDatabaseReference();
@@ -694,8 +731,6 @@ public class ComputerManagerService extends Service {
public void run() {
int emptyAppListResponses = 0;
do {
InetAddress selectedAddr;
// Can't poll if it's not online
if (computer.state != ComputerDetails.State.ONLINE) {
if (listener != null) {
@@ -709,19 +744,12 @@ public class ComputerManagerService extends Service {
continue;
}
if (computer.reachability == ComputerDetails.Reachability.LOCAL) {
selectedAddr = computer.localIp;
}
else {
selectedAddr = computer.remoteIp;
}
NvHTTP http = new NvHTTP(selectedAddr, idManager.getUniqueId(),
null, PlatformBinding.getCryptoProvider(ComputerManagerService.this));
PollingTuple tuple = getPollingTuple(computer);
try {
NvHTTP http = new NvHTTP(ServerHelper.getCurrentAddressFromComputer(computer), idManager.getUniqueId(),
null, PlatformBinding.getCryptoProvider(ComputerManagerService.this));
String appList;
if (tuple != null) {
// If we're polling this machine too, grab the network lock
@@ -816,10 +844,10 @@ class PollingTuple {
}
class ReachabilityTuple {
public final InetAddress reachableAddress;
public final String reachableAddress;
public final ComputerDetails computer;
public ReachabilityTuple(ComputerDetails computer, InetAddress reachableAddress) {
public ReachabilityTuple(ComputerDetails computer, String reachableAddress) {
this.computer = computer;
this.reachableAddress = reachableAddress;
}

View File

@@ -42,7 +42,7 @@ public class DiscoveryService extends Service {
@Override
public void onCreate() {
WifiManager wifiMgr = (WifiManager) getSystemService(Context.WIFI_SERVICE);
WifiManager wifiMgr = (WifiManager) getApplicationContext().getSystemService(Context.WIFI_SERVICE);
multicastLock = wifiMgr.createMulticastLock("Limelight mDNS");
multicastLock.setReferenceCounted(false);

View File

@@ -55,10 +55,10 @@ public abstract class GenericGridAdapter<T> extends BaseAdapter {
convertView = inflater.inflate(layoutId, viewGroup, false);
}
ImageView imgView = (ImageView) convertView.findViewById(R.id.grid_image);
ImageView overlayView = (ImageView) convertView.findViewById(R.id.grid_overlay);
TextView txtView = (TextView) convertView.findViewById(R.id.grid_text);
ProgressBar prgView = (ProgressBar) convertView.findViewById(R.id.grid_spinner);
ImageView imgView = convertView.findViewById(R.id.grid_image);
ImageView overlayView = convertView.findViewById(R.id.grid_overlay);
TextView txtView = convertView.findViewById(R.id.grid_text);
ProgressBar prgView = convertView.findViewById(R.id.grid_spinner);
if (imgView != null) {
if (!populateImageView(imgView, prgView, itemList.get(i))) {

View File

@@ -4,12 +4,11 @@ import android.content.Context;
import com.limelight.LimeLog;
import com.limelight.binding.PlatformBinding;
import com.limelight.nvstream.http.ComputerDetails;
import com.limelight.nvstream.http.NvHTTP;
import com.limelight.utils.ServerHelper;
import java.io.IOException;
import java.io.InputStream;
import java.net.InetAddress;
public class NetworkAssetLoader {
private final Context context;
@@ -21,10 +20,9 @@ public class NetworkAssetLoader {
}
public InputStream getBitmapStream(CachedAppAssetLoader.LoaderTuple tuple) {
NvHTTP http = new NvHTTP(getCurrentAddress(tuple.computer), uniqueId, null, PlatformBinding.getCryptoProvider(context));
InputStream in = null;
try {
NvHTTP http = new NvHTTP(ServerHelper.getCurrentAddressFromComputer(tuple.computer), uniqueId, null, PlatformBinding.getCryptoProvider(context));
in = http.getBoxArt(tuple.app);
} catch (IOException ignored) {}
@@ -37,13 +35,4 @@ public class NetworkAssetLoader {
return in;
}
private static InetAddress getCurrentAddress(ComputerDetails computer) {
if (computer.reachability == ComputerDetails.Reachability.LOCAL) {
return computer.localIp;
}
else {
return computer.remoteIp;
}
}
}

View File

@@ -1,8 +1,5 @@
package com.limelight.preferences;
import java.net.InetAddress;
import java.net.UnknownHostException;
import java.util.Locale;
import java.util.concurrent.LinkedBlockingQueue;
import com.limelight.computers.ComputerManagerService;
@@ -17,7 +14,6 @@ import android.content.ComponentName;
import android.content.Context;
import android.content.Intent;
import android.content.ServiceConnection;
import android.content.res.Configuration;
import android.os.Bundle;
import android.os.IBinder;
import android.view.KeyEvent;
@@ -50,18 +46,12 @@ public class AddComputerManually extends Activity {
SpinnerDialog dialog = SpinnerDialog.displayDialog(this, getResources().getString(R.string.title_add_pc),
getResources().getString(R.string.msg_add_pc), false);
try {
InetAddress addr = InetAddress.getByName(host);
if (!managerBinder.addComputerBlocking(addr)){
msg = getResources().getString(R.string.addpc_fail);
}
else {
msg = getResources().getString(R.string.addpc_success);
finish = true;
}
} catch (UnknownHostException e) {
msg = getResources().getString(R.string.addpc_unknown_host);
if (!managerBinder.addComputerBlocking(host, true)){
msg = getResources().getString(R.string.addpc_fail);
}
else {
msg = getResources().getString(R.string.addpc_success);
finish = true;
}
dialog.dismiss();
@@ -136,18 +126,13 @@ public class AddComputerManually extends Activity {
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
String locale = PreferenceConfiguration.readPreferences(this).language;
if (!locale.equals(PreferenceConfiguration.DEFAULT_LANGUAGE)) {
Configuration config = new Configuration(getResources().getConfiguration());
config.locale = new Locale(locale);
getResources().updateConfiguration(config, getResources().getDisplayMetrics());
}
UiHelper.setLocale(this);
setContentView(R.layout.activity_add_computer_manually);
UiHelper.notifyNewRootView(this);
this.hostText = (TextView) findViewById(R.id.hostTextView);
this.hostText = findViewById(R.id.hostTextView);
hostText.setImeOptions(EditorInfo.IME_ACTION_DONE);
hostText.setOnEditorActionListener(new TextView.OnEditorActionListener() {
@Override

View File

@@ -0,0 +1,37 @@
package com.limelight.preferences;
import android.content.Context;
import android.content.SharedPreferences;
public class GlPreferences {
private static final String PREF_NAME = "GlPreferences";
private static final String FINGERPRINT_PREF_STRING = "Fingerprint";
private static final String GL_RENDERER_PREF_STRING = "Renderer";
private SharedPreferences prefs;
public String glRenderer;
public String savedFingerprint;
private GlPreferences(SharedPreferences prefs) {
this.prefs = prefs;
}
public static GlPreferences readPreferences(Context context) {
SharedPreferences prefs = context.getSharedPreferences(PREF_NAME, 0);
GlPreferences glPrefs = new GlPreferences(prefs);
glPrefs.glRenderer = prefs.getString(GL_RENDERER_PREF_STRING, "");
glPrefs.savedFingerprint = prefs.getString(FINGERPRINT_PREF_STRING, "");
return glPrefs;
}
public boolean writePreferences() {
return prefs.edit()
.putString(GL_RENDERER_PREF_STRING, glRenderer)
.putString(FINGERPRINT_PREF_STRING, savedFingerprint)
.commit();
}
}
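The class pairs the cached renderer string with the build fingerprint so a system update invalidates the cache. A hedged sketch of how a caller might use it; the real flow fetches GL_RENDERER off-screen before starting a stream, and queryGlRenderer() here is hypothetical:
    GlPreferences glPrefs = GlPreferences.readPreferences(context);
    if (!glPrefs.savedFingerprint.equals(Build.FINGERPRINT) || glPrefs.glRenderer.isEmpty()) {
        // Cache miss or OS update: re-query GL_RENDERER (queryGlRenderer() is hypothetical)
        glPrefs.glRenderer = queryGlRenderer();
        glPrefs.savedFingerprint = Build.FINGERPRINT;
        glPrefs.writePreferences();
    }
    MediaCodecHelper.initialize(context, glPrefs.glRenderer);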

View File

@@ -3,7 +3,6 @@ package com.limelight.preferences;
import android.content.Context;
import android.content.SharedPreferences;
import android.content.pm.PackageManager;
import android.content.res.Configuration;
import android.os.Build;
import android.preference.PreferenceManager;
@@ -23,6 +22,11 @@ public class PreferenceConfiguration {
private static final String USB_DRIVER_PREF_SRING = "checkbox_usb_driver";
private static final String VIDEO_FORMAT_PREF_STRING = "video_format";
private static final String ONSCREEN_CONTROLLER_PREF_STRING = "checkbox_show_onscreen_controls";
private static final String ONLY_L3_R3_PREF_STRING = "checkbox_only_show_L3R3";
private static final String BATTERY_SAVER_PREF_STRING = "checkbox_battery_saver";
private static final String DISABLE_FRAME_DROP_PREF_STRING = "checkbox_disable_frame_drop";
private static final String ENABLE_HDR_PREF_STRING = "checkbox_enable_hdr";
private static final String ENABLE_PIP_PREF_STRING = "checkbox_enable_pip";
private static final int BITRATE_DEFAULT_720_30 = 5;
private static final int BITRATE_DEFAULT_720_60 = 10;
@@ -45,6 +49,11 @@ public class PreferenceConfiguration {
private static final boolean DEFAULT_USB_DRIVER = true;
private static final String DEFAULT_VIDEO_FORMAT = "auto";
private static final boolean ONSCREEN_CONTROLLER_DEFAULT = false;
private static final boolean ONLY_L3_R3_DEFAULT = false;
private static final boolean DEFAULT_BATTERY_SAVER = false;
private static final boolean DEFAULT_DISABLE_FRAME_DROP = false;
private static final boolean DEFAULT_ENABLE_HDR = false;
private static final boolean DEFAULT_ENABLE_PIP = false;
public static final int FORCE_H265_ON = -1;
public static final int AUTOSELECT_H265 = 0;
@@ -58,6 +67,11 @@ public class PreferenceConfiguration {
public String language;
public boolean listMode, smallIconMode, multiController, enable51Surround, usbDriver;
public boolean onscreenController;
public boolean onlyL3R3;
public boolean batterySaver;
public boolean disableFrameDrop;
public boolean enableHdr;
public boolean enablePip;
public static int getDefaultBitrate(String resFpsString) {
if (resFpsString.equals("720p30")) {
@@ -128,6 +142,17 @@ public class PreferenceConfiguration {
}
}
public static void resetStreamingSettings(Context context) {
// We consider resolution, FPS, bitrate, HDR, and video format as "streaming settings" here
SharedPreferences prefs = PreferenceManager.getDefaultSharedPreferences(context);
prefs.edit()
.remove(BITRATE_PREF_STRING)
.remove(RES_FPS_PREF_STRING)
.remove(VIDEO_FORMAT_PREF_STRING)
.remove(ENABLE_HDR_PREF_STRING)
.apply();
}
public static PreferenceConfiguration readPreferences(Context context) {
SharedPreferences prefs = PreferenceManager.getDefaultSharedPreferences(context);
PreferenceConfiguration config = new PreferenceConfiguration();
@@ -188,6 +213,11 @@ public class PreferenceConfiguration {
config.enable51Surround = prefs.getBoolean(ENABLE_51_SURROUND_PREF_STRING, DEFAULT_ENABLE_51_SURROUND);
config.usbDriver = prefs.getBoolean(USB_DRIVER_PREF_SRING, DEFAULT_USB_DRIVER);
config.onscreenController = prefs.getBoolean(ONSCREEN_CONTROLLER_PREF_STRING, ONSCREEN_CONTROLLER_DEFAULT);
config.onlyL3R3 = prefs.getBoolean(ONLY_L3_R3_PREF_STRING, ONLY_L3_R3_DEFAULT);
config.batterySaver = prefs.getBoolean(BATTERY_SAVER_PREF_STRING, DEFAULT_BATTERY_SAVER);
config.disableFrameDrop = prefs.getBoolean(DISABLE_FRAME_DROP_PREF_STRING, DEFAULT_DISABLE_FRAME_DROP);
config.enableHdr = prefs.getBoolean(ENABLE_HDR_PREF_STRING, DEFAULT_ENABLE_HDR);
config.enablePip = prefs.getBoolean(ENABLE_PIP_PREF_STRING, DEFAULT_ENABLE_PIP);
return config;
}

View File

@@ -2,21 +2,25 @@ package com.limelight.preferences;
import android.content.Intent;
import android.content.SharedPreferences;
import android.content.res.Configuration;
import android.media.MediaCodecInfo;
import android.os.Build;
import android.os.Bundle;
import android.app.Activity;
import android.preference.ListPreference;
import android.preference.Preference;
import android.preference.PreferenceCategory;
import android.preference.PreferenceFragment;
import android.preference.PreferenceManager;
import android.preference.PreferenceScreen;
import android.util.Range;
import android.view.Display;
import com.limelight.LimeLog;
import com.limelight.PcView;
import com.limelight.R;
import com.limelight.binding.video.MediaCodecHelper;
import com.limelight.utils.UiHelper;
import java.util.Locale;
public class StreamSettings extends Activity {
private PreferenceConfiguration previousPrefs;
@@ -26,11 +30,7 @@ public class StreamSettings extends Activity {
previousPrefs = PreferenceConfiguration.readPreferences(this);
if (!previousPrefs.language.equals(PreferenceConfiguration.DEFAULT_LANGUAGE)) {
Configuration config = new Configuration(getResources().getConfiguration());
config.locale = new Locale(previousPrefs.language);
getResources().updateConfiguration(config, getResources().getDisplayMetrics());
}
UiHelper.setLocale(this);
setContentView(R.layout.activity_stream_settings);
getFragmentManager().beginTransaction().replace(
@@ -57,6 +57,37 @@ public class StreamSettings extends Activity {
}
public static class SettingsFragment extends PreferenceFragment {
private static void removeResolution(ListPreference pref, String prefix) {
int matchingCount = 0;
// Count the number of matching entries we'll be removing
for (CharSequence seq : pref.getEntryValues()) {
if (seq.toString().startsWith(prefix)) {
matchingCount++;
}
}
// Create the new arrays
CharSequence[] entries = new CharSequence[pref.getEntries().length-matchingCount];
CharSequence[] entryValues = new CharSequence[pref.getEntryValues().length-matchingCount];
int outIndex = 0;
for (int i = 0; i < pref.getEntryValues().length; i++) {
if (pref.getEntryValues()[i].toString().startsWith(prefix)) {
// Skip matching prefixes
continue;
}
entries[outIndex] = pref.getEntries()[i];
entryValues[outIndex] = pref.getEntryValues()[i];
outIndex++;
}
// Update the preference with the new list
pref.setEntries(entries);
pref.setEntryValues(entryValues);
}
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
@@ -72,6 +103,130 @@ public class StreamSettings extends Activity {
screen.removePreference(category);
}
// Remove PiP mode on devices pre-Oreo
if (Build.VERSION.SDK_INT < Build.VERSION_CODES.O) {
PreferenceCategory category =
(PreferenceCategory) findPreference("category_basic_settings");
category.removePreference(findPreference("checkbox_enable_pip"));
}
// Hide non-supported resolution/FPS combinations
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) {
Display display = getActivity().getWindowManager().getDefaultDisplay();
int maxSupportedResW = 0;
// Always allow resolutions that are smaller than or equal to the active
// display resolution, because decoders can report total nonsense to us.
// For example, a p201 device reports:
// AVC Decoder: OMX.amlogic.avc.decoder.awesome
// HEVC Decoder: OMX.amlogic.hevc.decoder.awesome
// AVC supported width range: 64 - 384
// HEVC supported width range: 64 - 544
for (Display.Mode candidate : display.getSupportedModes()) {
// Some devices report their dimensions in the portrait orientation
// where height > width. Normalize these to the conventional width > height
// arrangement before we process them.
int width = Math.max(candidate.getPhysicalWidth(), candidate.getPhysicalHeight());
int height = Math.min(candidate.getPhysicalWidth(), candidate.getPhysicalHeight());
if ((width >= 3840 || height >= 2160) && maxSupportedResW < 3840) {
maxSupportedResW = 3840;
}
else if ((width >= 1920 || height >= 1080) && maxSupportedResW < 1920) {
maxSupportedResW = 1920;
}
}
// This must be called to do runtime initialization before calling functions that evaluate
// decoder lists.
MediaCodecHelper.initialize(getContext(), GlPreferences.readPreferences(getContext()).glRenderer);
MediaCodecInfo avcDecoder = MediaCodecHelper.findProbableSafeDecoder("video/avc", -1);
MediaCodecInfo hevcDecoder = MediaCodecHelper.findProbableSafeDecoder("video/hevc", -1);
if (avcDecoder != null) {
Range<Integer> avcWidthRange = avcDecoder.getCapabilitiesForType("video/avc").getVideoCapabilities().getSupportedWidths();
LimeLog.info("AVC supported width range: "+avcWidthRange.getLower()+" - "+avcWidthRange.getUpper());
// If 720p is not reported as supported, ignore all results from this API
if (avcWidthRange.contains(1280)) {
if (avcWidthRange.contains(3840) && maxSupportedResW < 3840) {
maxSupportedResW = 3840;
}
else if (avcWidthRange.contains(1920) && maxSupportedResW < 1920) {
maxSupportedResW = 1920;
}
else if (maxSupportedResW < 1280) {
maxSupportedResW = 1280;
}
}
}
if (hevcDecoder != null) {
Range<Integer> hevcWidthRange = hevcDecoder.getCapabilitiesForType("video/hevc").getVideoCapabilities().getSupportedWidths();
LimeLog.info("HEVC supported width range: "+hevcWidthRange.getLower()+" - "+hevcWidthRange.getUpper());
// If 720p is not reported as supported, ignore all results from this API
if (hevcWidthRange.contains(1280)) {
if (hevcWidthRange.contains(3840) && maxSupportedResW < 3840) {
maxSupportedResW = 3840;
}
else if (hevcWidthRange.contains(1920) && maxSupportedResW < 1920) {
maxSupportedResW = 1920;
}
else if (maxSupportedResW < 1280) {
maxSupportedResW = 1280;
}
}
}
LimeLog.info("Maximum resolution slot: "+maxSupportedResW);
ListPreference resPref = (ListPreference) findPreference("list_resolution_fps");
if (maxSupportedResW != 0) {
if (maxSupportedResW < 3840) {
// 4K is unsupported
removeResolution(resPref, "4K");
}
if (maxSupportedResW < 1920) {
// 1080p is unsupported
removeResolution(resPref, "1080p");
}
// Never remove 720p
}
}
// Remove HDR preference for devices below Nougat
if (Build.VERSION.SDK_INT < Build.VERSION_CODES.N) {
LimeLog.info("Excluding HDR toggle based on OS");
PreferenceCategory category =
(PreferenceCategory) findPreference("category_advanced_settings");
category.removePreference(findPreference("checkbox_enable_hdr"));
}
else {
Display display = getActivity().getWindowManager().getDefaultDisplay();
Display.HdrCapabilities hdrCaps = display.getHdrCapabilities();
// We must now ensure our display is compatible with HDR10
boolean foundHdr10 = false;
for (int hdrType : hdrCaps.getSupportedHdrTypes()) {
if (hdrType == Display.HdrCapabilities.HDR_TYPE_HDR10) {
foundHdr10 = true;
}
}
if (!foundHdr10) {
LimeLog.info("Excluding HDR toggle based on display capabilities");
PreferenceCategory category =
(PreferenceCategory) findPreference("category_advanced_settings");
category.removePreference(findPreference("checkbox_enable_hdr"));
}
}
// Add a listener to the FPS and resolution preference
// so the bitrate can be auto-adjusted
Preference pref = findPreference(PreferenceConfiguration.RES_FPS_PREF_STRING);

View File

@@ -3,6 +3,6 @@ package com.limelight.ui;
import android.widget.AbsListView;
public interface AdapterFragmentCallbacks {
public int getAdapterFragmentLayoutId();
public void receiveAbsListView(AbsListView gridView);
int getAdapterFragmentLayoutId();
void receiveAbsListView(AbsListView gridView);
}

View File

@@ -1,5 +1,5 @@
package com.limelight.ui;
public interface GameGestures {
public void showKeyboard();
void showKeyboard();
}

View File

@@ -13,18 +13,18 @@ public class Dialog implements Runnable {
private final String title;
private final String message;
private final Activity activity;
private final boolean endAfterDismiss;
private final Runnable runOnDismiss;
private AlertDialog alert;
private static final ArrayList<Dialog> rundownDialogs = new ArrayList<>();
private Dialog(Activity activity, String title, String message, boolean endAfterDismiss)
private Dialog(Activity activity, String title, String message, Runnable runOnDismiss)
{
this.activity = activity;
this.title = title;
this.message = message;
this.endAfterDismiss = endAfterDismiss;
this.runOnDismiss = runOnDismiss;
}
public static void closeDialogs()
@@ -40,9 +40,21 @@ public class Dialog implements Runnable {
}
}
public static void displayDialog(Activity activity, String title, String message, boolean endAfterDismiss)
public static void displayDialog(final Activity activity, String title, String message, final boolean endAfterDismiss)
{
activity.runOnUiThread(new Dialog(activity, title, message, endAfterDismiss));
activity.runOnUiThread(new Dialog(activity, title, message, new Runnable() {
@Override
public void run() {
if (endAfterDismiss) {
activity.finish();
}
}
}));
}
public static void displayDialog(Activity activity, String title, String message, Runnable runOnDismiss)
{
activity.runOnUiThread(new Dialog(activity, title, message, runOnDismiss));
}
@Override
@@ -65,9 +77,7 @@ public class Dialog implements Runnable {
alert.dismiss();
}
if (endAfterDismiss) {
activity.finish();
}
runOnDismiss.run();
}
});
alert.setButton(AlertDialog.BUTTON_NEUTRAL, activity.getResources().getText(R.string.help), new DialogInterface.OnClickListener() {
@@ -77,9 +87,7 @@ public class Dialog implements Runnable {
alert.dismiss();
}
if (endAfterDismiss) {
activity.finish();
}
runOnDismiss.run();
HelpLauncher.launchTroubleshooting(activity);
}

View File

@@ -1,24 +1,43 @@
package com.limelight.utils;
import android.content.ActivityNotFoundException;
import android.content.Context;
import android.content.Intent;
import android.content.pm.PackageManager;
import android.content.pm.ResolveInfo;
import android.net.Uri;
import android.os.Build;
import com.limelight.HelpActivity;
public class HelpLauncher {
private static boolean isKnownBrowser(Context context, Intent i) {
ResolveInfo resolvedActivity = context.getPackageManager().resolveActivity(i, PackageManager.MATCH_DEFAULT_ONLY);
if (resolvedActivity == null) {
// No browser
return false;
}
String name = resolvedActivity.activityInfo.name;
if (name == null) {
return false;
}
name = name.toLowerCase();
return name.contains("chrome") || name.contains("firefox");
}
private static void launchUrl(Context context, String url) {
// Try to launch the default browser
try {
// Fire TV devices will lie and say they do have a browser
// even though the OS just shows an error dialog if we
// try to use it.
if (!"Amazon".equalsIgnoreCase(Build.MANUFACTURER)) {
Intent i = new Intent(Intent.ACTION_VIEW);
i.setData(Uri.parse(url));
Intent i = new Intent(Intent.ACTION_VIEW);
i.setData(Uri.parse(url));
// Several Android TV devices will lie and say they do have a browser
// even though the OS just shows an error dialog if we try to use it. We need to
// be a bit more clever on these devices and detect if the browser is a legitimate
// browser or just a fake error message activity.
if (!context.getPackageManager().hasSystemFeature(PackageManager.FEATURE_LEANBACK) ||
isKnownBrowser(context, i)) {
context.startActivity(i);
return;
}

View File

@@ -14,23 +14,21 @@ import com.limelight.nvstream.http.NvApp;
import com.limelight.nvstream.http.NvHTTP;
import java.io.FileNotFoundException;
import java.net.InetAddress;
import java.net.UnknownHostException;
public class ServerHelper {
public static InetAddress getCurrentAddressFromComputer(ComputerDetails computer) {
public static String getCurrentAddressFromComputer(ComputerDetails computer) {
return computer.reachability == ComputerDetails.Reachability.LOCAL ?
computer.localIp : computer.remoteIp;
computer.localAddress : computer.remoteAddress;
}
public static Intent createStartIntent(Activity parent, NvApp app, ComputerDetails computer,
ComputerManagerService.ComputerManagerBinder managerBinder) {
Intent intent = new Intent(parent, Game.class);
intent.putExtra(Game.EXTRA_HOST,
computer.reachability == ComputerDetails.Reachability.LOCAL ?
computer.localIp.getHostAddress() : computer.remoteIp.getHostAddress());
intent.putExtra(Game.EXTRA_HOST, getCurrentAddressFromComputer(computer));
intent.putExtra(Game.EXTRA_APP_NAME, app.getAppName());
intent.putExtra(Game.EXTRA_APP_ID, app.getAppId());
intent.putExtra(Game.EXTRA_APP_HDR, app.isHdrSupported());
intent.putExtra(Game.EXTRA_UNIQUEID, managerBinder.getUniqueId());
intent.putExtra(Game.EXTRA_STREAMING_REMOTE,
computer.reachability != ComputerDetails.Reachability.LOCAL);
@@ -45,7 +43,7 @@ public class ServerHelper {
}
public static void doQuit(final Activity parent,
final InetAddress address,
final String address,
final NvApp app,
final ComputerManagerService.ComputerManagerBinder managerBinder,
final Runnable onComplete) {

View File

@@ -5,10 +5,14 @@ import android.app.AlertDialog;
import android.app.UiModeManager;
import android.content.Context;
import android.content.DialogInterface;
import android.content.SharedPreferences;
import android.content.res.Configuration;
import android.view.View;
import com.limelight.R;
import com.limelight.preferences.PreferenceConfiguration;
import java.util.Locale;
public class UiHelper {
@@ -16,6 +20,28 @@ public class UiHelper {
private static final int TV_VERTICAL_PADDING_DP = 27;
private static final int TV_HORIZONTAL_PADDING_DP = 48;
public static void setLocale(Activity activity)
{
String locale = PreferenceConfiguration.readPreferences(activity).language;
if (!locale.equals(PreferenceConfiguration.DEFAULT_LANGUAGE)) {
Configuration config = new Configuration(activity.getResources().getConfiguration());
// Some locales include both language and country, which must be separated
// before calling the Locale constructor.
if (locale.contains("-"))
{
config.locale = new Locale(locale.substring(0, locale.indexOf('-')),
locale.substring(locale.indexOf('-') + 1));
}
else
{
config.locale = new Locale(locale);
}
activity.getResources().updateConfiguration(config, activity.getResources().getDisplayMetrics());
}
}
public static void notifyNewRootView(Activity activity)
{
View rootView = activity.findViewById(android.R.id.content);
@@ -33,6 +59,44 @@ public class UiHelper {
}
}
public static void showDecoderCrashDialog(Activity activity) {
final SharedPreferences prefs = activity.getSharedPreferences("DecoderTombstone", 0);
final int crashCount = prefs.getInt("CrashCount", 0);
int lastNotifiedCrashCount = prefs.getInt("LastNotifiedCrashCount", 0);
// Remember the last crash count we notified at, so we don't
// display the crash dialog every time the app is started until
// they stream again
if (crashCount != 0 && crashCount != lastNotifiedCrashCount) {
if (crashCount % 3 == 0) {
// At 3 consecutive crashes, we'll forcefully reset their settings
PreferenceConfiguration.resetStreamingSettings(activity);
Dialog.displayDialog(activity,
activity.getResources().getString(R.string.title_decoding_reset),
activity.getResources().getString(R.string.message_decoding_reset),
new Runnable() {
@Override
public void run() {
// Mark notification as acknowledged on dismissal
prefs.edit().putInt("LastNotifiedCrashCount", crashCount).apply();
}
});
}
else {
Dialog.displayDialog(activity,
activity.getResources().getString(R.string.title_decoding_error),
activity.getResources().getString(R.string.message_decoding_error),
new Runnable() {
@Override
public void run() {
// Mark notification as acknowledged on dismissal
prefs.edit().putInt("LastNotifiedCrashCount", crashCount).apply();
}
});
}
}
}
public static void displayQuitConfirmationDialog(Activity parent, final Runnable onYes, final Runnable onNo) {
DialogInterface.OnClickListener dialogClickListener = new DialogInterface.OnClickListener() {
@Override

View File

@@ -0,0 +1,47 @@
package com.limelight.utils;
public class Vector2d {
private float x;
private float y;
private double magnitude;
public static final Vector2d ZERO = new Vector2d();
public Vector2d() {
initialize(0, 0);
}
public void initialize(float x, float y) {
this.x = x;
this.y = y;
this.magnitude = Math.sqrt(Math.pow(x, 2) + Math.pow(y, 2));
}
public double getMagnitude() {
return magnitude;
}
public void getNormalized(Vector2d vector) {
vector.initialize((float)(x / magnitude), (float)(y / magnitude));
}
public void scalarMultiply(double factor) {
initialize((float)(x * factor), (float)(y * factor));
}
public void setX(float x) {
initialize(x, this.y);
}
public void setY(float y) {
initialize(this.x, y);
}
public float getX() {
return x;
}
public float getY() {
return y;
}
}

View File

@@ -1,10 +1,4 @@
-# Application.mk for Limelight
+# Application.mk for Moonlight
# Our minimum version is Android 4.1
APP_PLATFORM := android-16
# Support all modern ABIs
APP_ABI := armeabi-v7a arm64-v8a x86 x86_64 mips mips64
# We want an optimized build
APP_OPTIM := release

View File

@@ -5,28 +5,32 @@ include $(call all-subdir-makefiles)
LOCAL_PATH := $(MY_LOCAL_PATH)
include $(CLEAR_VARS)
LOCAL_MODULE := evdev_reader
LOCAL_SRC_FILES := evdev_reader.c
LOCAL_LDLIBS := -llog
# Only build evdev_reader for the rooted APK flavor
ifeq (root,$(PRODUCT_FLAVOR))
include $(CLEAR_VARS)
LOCAL_MODULE := evdev_reader
LOCAL_SRC_FILES := evdev_reader.c
LOCAL_LDLIBS := -llog
# This next portion of the makefile is mostly copied from build-executable.mk but
# creates a binary with the libXXX.so form so the APK will install and drop
# the binary correctly.
# This next portion of the makefile is mostly copied from build-executable.mk but
# creates a binary with the libXXX.so form so the APK will install and drop
# the binary correctly.
LOCAL_BUILD_SCRIPT := BUILD_EXECUTABLE
LOCAL_MAKEFILE := $(local-makefile)
LOCAL_BUILD_SCRIPT := BUILD_EXECUTABLE
LOCAL_MAKEFILE := $(local-makefile)
$(call check-defined-LOCAL_MODULE,$(LOCAL_BUILD_SCRIPT))
$(call check-LOCAL_MODULE,$(LOCAL_MAKEFILE))
$(call check-LOCAL_MODULE_FILENAME)
$(call check-defined-LOCAL_MODULE,$(LOCAL_BUILD_SCRIPT))
$(call check-LOCAL_MODULE,$(LOCAL_MAKEFILE))
$(call check-LOCAL_MODULE_FILENAME)
# we are building target objects
my := TARGET_
# we are building target objects
my := TARGET_
$(call handle-module-filename,lib,$(TARGET_SONAME_EXTENSION))
$(call handle-module-built)
$(call handle-module-filename,lib,$(TARGET_SONAME_EXTENSION))
$(call handle-module-built)
LOCAL_MODULE_CLASS := EXECUTABLE
include $(BUILD_SYSTEM)/build-module.mk
endif
LOCAL_MODULE_CLASS := EXECUTABLE
include $(BUILD_SYSTEM)/build-module.mk

View File

@@ -1,25 +0,0 @@
# Android.mk for Moonlight's ENet JNI binding
MY_LOCAL_PATH := $(call my-dir)
include $(call all-subdir-makefiles)
LOCAL_PATH := $(MY_LOCAL_PATH)
include $(CLEAR_VARS)
LOCAL_MODULE := jnienet
LOCAL_SRC_FILES := jnienet.c \
enet/callbacks.c \
enet/compress.c \
enet/host.c \
enet/list.c \
enet/packet.c \
enet/peer.c \
enet/protocol.c \
enet/unix.c \
enet/win32.c \
LOCAL_CFLAGS := -DHAS_SOCKLEN_T=1
LOCAL_C_INCLUDES := $(LOCAL_PATH)/enet/include
include $(BUILD_SHARED_LIBRARY)

View File

@@ -1,148 +0,0 @@
#include "enet/enet.h"
#include <stdlib.h>
#include <string.h>
#include <jni.h>
#define CLIENT_TO_LONG(x) ((intptr_t)(x))
#define LONG_TO_CLIENT(x) ((ENetHost*)(intptr_t)(x))
#define PEER_TO_LONG(x) ((intptr_t)(x))
#define LONG_TO_PEER(x) ((ENetPeer*)(intptr_t)(x))
JNIEXPORT jint JNICALL
Java_com_limelight_nvstream_enet_EnetConnection_initializeEnet(JNIEnv *env, jobject class) {
return enet_initialize();
}
JNIEXPORT jlong JNICALL
Java_com_limelight_nvstream_enet_EnetConnection_createClient(JNIEnv *env, jobject class, jstring address) {
ENetAddress enetAddress;
const char *addrStr;
int err;
// Perform a lookup on the address to determine the address family
addrStr = (*env)->GetStringUTFChars(env, address, 0);
err = enet_address_set_host(&enetAddress, addrStr);
(*env)->ReleaseStringUTFChars(env, address, addrStr);
if (err < 0) {
return CLIENT_TO_LONG(NULL);
}
// Create a client that can use 1 outgoing connection and 1 channel
return CLIENT_TO_LONG(enet_host_create(enetAddress.address.ss_family, NULL, 1, 1, 0, 0));
}
JNIEXPORT jlong JNICALL
Java_com_limelight_nvstream_enet_EnetConnection_connectToPeer(JNIEnv *env, jobject class, jlong client, jstring address, jint port, jint timeout) {
ENetPeer* peer;
ENetAddress enetAddress;
ENetEvent event;
const char *addrStr;
int err;
// Initialize the ENet address
addrStr = (*env)->GetStringUTFChars(env, address, 0);
err = enet_address_set_host(&enetAddress, addrStr);
enet_address_set_port(&enetAddress, port);
(*env)->ReleaseStringUTFChars(env, address, addrStr);
if (err < 0) {
return PEER_TO_LONG(NULL);
}
// Start the connection
peer = enet_host_connect(LONG_TO_CLIENT(client), &enetAddress, 1, 0);
if (peer == NULL) {
return PEER_TO_LONG(NULL);
}
// Wait for the connect to complete
if (enet_host_service(LONG_TO_CLIENT(client), &event, timeout) <= 0 || event.type != ENET_EVENT_TYPE_CONNECT) {
enet_peer_reset(peer);
return PEER_TO_LONG(NULL);
}
// Ensure the connect verify ACK is sent immediately
enet_host_flush(LONG_TO_CLIENT(client));
// Set the max peer timeout to 10 seconds
enet_peer_timeout(peer, ENET_PEER_TIMEOUT_LIMIT, ENET_PEER_TIMEOUT_MINIMUM, 10000);
return PEER_TO_LONG(peer);
}
JNIEXPORT jint JNICALL
Java_com_limelight_nvstream_enet_EnetConnection_readPacket(JNIEnv *env, jobject class, jlong client, jbyteArray data, jint length, jint timeout) {
jint err;
jbyte* dataPtr;
ENetEvent event;
// Wait for a receive event, timeout, or disconnect
err = enet_host_service(LONG_TO_CLIENT(client), &event, timeout);
if (err <= 0) {
return err;
}
else if (event.type != ENET_EVENT_TYPE_RECEIVE) {
return -1;
}
// Check that the packet isn't too large
if (event.packet->dataLength > length) {
// Save the length before destroying the packet to avoid reading freed memory
err = event.packet->dataLength;
enet_packet_destroy(event.packet);
return err;
}
// Copy the packet data into the caller's buffer
dataPtr = (*env)->GetByteArrayElements(env, data, 0);
memcpy(dataPtr, event.packet->data, event.packet->dataLength);
err = event.packet->dataLength;
(*env)->ReleaseByteArrayElements(env, data, dataPtr, 0);
// Free the packet
enet_packet_destroy(event.packet);
return err;
}
JNIEXPORT jboolean JNICALL
Java_com_limelight_nvstream_enet_EnetConnection_writePacket(JNIEnv *env, jobject class, jlong client, jlong peer, jbyteArray data, jint length, jint packetFlags) {
ENetPacket* packet;
jboolean ret;
jbyte* dataPtr;
dataPtr = (*env)->GetByteArrayElements(env, data, 0);
// Create the reliable packet that describes our outgoing message
packet = enet_packet_create(dataPtr, length, packetFlags);
if (packet != NULL) {
// Send the message to the peer
if (enet_peer_send(LONG_TO_PEER(peer), 0, packet) < 0) {
// This can fail if the peer has been disconnected
enet_packet_destroy(packet);
ret = JNI_FALSE;
}
else {
// Force the client to send the packet now
enet_host_flush(LONG_TO_CLIENT(client));
ret = JNI_TRUE;
}
}
else {
ret = JNI_FALSE;
}
(*env)->ReleaseByteArrayElements(env, data, dataPtr, JNI_ABORT);
return ret;
}
JNIEXPORT void JNICALL
Java_com_limelight_nvstream_enet_EnetConnection_destroyClient(JNIEnv *env, jobject class, jlong client) {
enet_host_destroy(LONG_TO_CLIENT(client));
}
JNIEXPORT void JNICALL
Java_com_limelight_nvstream_enet_EnetConnection_disconnectPeer(JNIEnv *env, jobject class, jlong peer) {
enet_peer_disconnect_now(LONG_TO_PEER(peer), 0);
}

View File

@@ -1,16 +0,0 @@
# Android.mk for Limelight's Opus decoder
MY_LOCAL_PATH := $(call my-dir)
include $(call all-subdir-makefiles)
LOCAL_PATH := $(MY_LOCAL_PATH)
include $(CLEAR_VARS)
LOCAL_MODULE := nv_opus_dec
LOCAL_SRC_FILES := nv_opus_dec.c nv_opus_dec_jni.c
LOCAL_C_INCLUDES := $(LOCAL_PATH)/libopus/inc
# Link to libopus library
LOCAL_STATIC_LIBRARIES := libopus
include $(BUILD_SHARED_LIBRARY)

View File

@@ -1,7 +0,0 @@
LOCAL_PATH:= $(call my-dir)
include $(CLEAR_VARS)
LOCAL_MODULE:= libopus
LOCAL_SRC_FILES:= $(TARGET_ARCH_ABI)/libopus.a
LOCAL_EXPORT_C_INCLUDES := $(LOCAL_PATH)/inc
include $(PREBUILT_STATIC_LIBRARY)

View File

@@ -1,100 +0,0 @@
ANDROID_API_TARGET=21
PARALLEL_JOBS=$(nproc)
rm -r ./android
mkdir android
function build_one
{
PREFIX=$(pwd)/android/$CPU
SYSROOT=$NDK/platforms/android-$ANDROID_API_TARGET/arch-$SYSROOT_CPU
TOOLCHAIN_PATH=$NDK/toolchains/$TOOLCHAIN_DIR/prebuilt/linux-x86_64
export PATH=$PATH:$TOOLCHAIN_PATH/bin
./configure \
--build=x86_64-unknown-linux-gnu \
--host=$TOOLCHAIN_BIN_PREFIX \
--target=$TOOLCHAIN_BIN_PREFIX \
CFLAGS="--sysroot=$SYSROOT -O2 $ADDI_CFLAGS" \
$ADDI_CONFIGURE_FLAGS
make clean
make -j$PARALLEL_JOBS
mkdir android/$CPU
cp .libs/libopus.a android/$CPU
}
function build_mips
{
CPU=mips
SYSROOT_CPU=mips
TOOLCHAIN_BIN_PREFIX=mipsel-linux-android
TOOLCHAIN_DIR=mipsel-linux-android-4.9
ADDI_CFLAGS="-mips32 -mhard-float -EL -mno-dsp"
ADDI_CONFIGURE_FLAGS="--enable-fixed-point" # fixed point
build_one
}
function build_mips64
{
CPU=mips64
SYSROOT_CPU=mips64
TOOLCHAIN_BIN_PREFIX=mips64el-linux-android
TOOLCHAIN_DIR=mips64el-linux-android-4.9
ADDI_CFLAGS="-mips64r6"
ADDI_CONFIGURE_FLAGS="--enable-fixed-point" # fixed point
build_one
}
function build_x86
{
CPU=x86
SYSROOT_CPU=x86
TOOLCHAIN_BIN_PREFIX=i686-linux-android
TOOLCHAIN_DIR=x86-4.9
ADDI_CFLAGS="-march=i686 -mtune=atom -mstackrealign -msse -msse2 -msse3 -mssse3 -mfpmath=sse -m32"
ADDI_CONFIGURE_FLAGS="" # floating point for SSE optimizations
build_one
}
function build_x86_64
{
CPU=x86_64
SYSROOT_CPU=x86_64
TOOLCHAIN_BIN_PREFIX=x86_64-linux-android
TOOLCHAIN_DIR=x86_64-4.9
ADDI_CFLAGS="-msse -msse2 -msse3 -mssse3 -msse4 -msse4.1 -msse4.2 -mpopcnt -m64"
ADDI_CONFIGURE_FLAGS="" # floating point for SSE optimizations
build_one
}
function build_armv7
{
CPU=arm
SYSROOT_CPU=arm
TOOLCHAIN_BIN_PREFIX=arm-linux-androideabi
TOOLCHAIN_DIR=arm-linux-androideabi-4.9
ADDI_CFLAGS="-marm -mfpu=vfpv3-d16"
ADDI_LDFLAGS=""
ADDI_CONFIGURE_FLAGS="--enable-fixed-point" # fixed point for NEON, EDSP, Media
build_one
}
# ARMv8 doesn't currently have assembly in the opus project. We still use fixed point
# anyway in the hopes that it will be more performant even without assembly.
function build_armv8
{
CPU=aarch64
SYSROOT_CPU=arm64
TOOLCHAIN_BIN_PREFIX=aarch64-linux-android
TOOLCHAIN_DIR=aarch64-linux-android-4.9
ADDI_CFLAGS=""
ADDI_LDFLAGS=""
ADDI_CONFIGURE_FLAGS="--enable-fixed-point"
build_one
}
build_mips
build_mips64
build_x86
build_x86_64
build_armv7
build_armv8

View File

@@ -1,978 +0,0 @@
/* Copyright (c) 2010-2011 Xiph.Org Foundation, Skype Limited
Written by Jean-Marc Valin and Koen Vos */
/*
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions
are met:
- Redistributions of source code must retain the above copyright
notice, this list of conditions and the following disclaimer.
- Redistributions in binary form must reproduce the above copyright
notice, this list of conditions and the following disclaimer in the
documentation and/or other materials provided with the distribution.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER
OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF
LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
/**
* @file opus.h
* @brief Opus reference implementation API
*/
#ifndef OPUS_H
#define OPUS_H
#include "opus_types.h"
#include "opus_defines.h"
#ifdef __cplusplus
extern "C" {
#endif
/**
* @mainpage Opus
*
* The Opus codec is designed for interactive speech and audio transmission over the Internet.
* It is designed by the IETF Codec Working Group and incorporates technology from
* Skype's SILK codec and Xiph.Org's CELT codec.
*
* The Opus codec is designed to handle a wide range of interactive audio applications,
* including Voice over IP, videoconferencing, in-game chat, and even remote live music
* performances. It can scale from low bit-rate narrowband speech to very high quality
* stereo music. Its main features are:
* @li Sampling rates from 8 to 48 kHz
* @li Bit-rates from 6 kb/s to 510 kb/s
* @li Support for both constant bit-rate (CBR) and variable bit-rate (VBR)
* @li Audio bandwidth from narrowband to full-band
* @li Support for speech and music
* @li Support for mono and stereo
* @li Support for multichannel (up to 255 channels)
* @li Frame sizes from 2.5 ms to 60 ms
* @li Good loss robustness and packet loss concealment (PLC)
* @li Floating point and fixed-point implementation
*
* Documentation sections:
* @li @ref opus_encoder
* @li @ref opus_decoder
* @li @ref opus_repacketizer
* @li @ref opus_multistream
* @li @ref opus_libinfo
* @li @ref opus_custom
*/
/** @defgroup opus_encoder Opus Encoder
* @{
*
* @brief This page describes the process and functions used to encode Opus.
*
* Since Opus is a stateful codec, the encoding process starts with creating an encoder
* state. This can be done with:
*
* @code
* int error;
* OpusEncoder *enc;
* enc = opus_encoder_create(Fs, channels, application, &error);
* @endcode
*
* From this point, @c enc can be used for encoding an audio stream. An encoder state
* @b must @b not be used for more than one stream at the same time. Similarly, the encoder
* state @b must @b not be re-initialized for each frame.
*
* While opus_encoder_create() allocates memory for the state, it's also possible
* to initialize pre-allocated memory:
*
* @code
* int size;
* int error;
* OpusEncoder *enc;
* size = opus_encoder_get_size(channels);
* enc = malloc(size);
* error = opus_encoder_init(enc, Fs, channels, application);
* @endcode
*
* where opus_encoder_get_size() returns the required size for the encoder state. Note that
* future versions of this code may change the size, so no assumptions should be made about it.
*
* The encoder state is always continuous in memory and only a shallow copy is sufficient
* to copy it (e.g. memcpy())
*
* It is possible to change some of the encoder's settings using the opus_encoder_ctl()
* interface. All these settings already default to the recommended value, so they should
* only be changed when necessary. The most common settings one may want to change are:
*
* @code
* opus_encoder_ctl(enc, OPUS_SET_BITRATE(bitrate));
* opus_encoder_ctl(enc, OPUS_SET_COMPLEXITY(complexity));
* opus_encoder_ctl(enc, OPUS_SET_SIGNAL(signal_type));
* @endcode
*
* where
*
* @arg bitrate is in bits per second (b/s)
* @arg complexity is a value from 1 to 10, where 1 is the lowest complexity and 10 is the highest
* @arg signal_type is either OPUS_AUTO (default), OPUS_SIGNAL_VOICE, or OPUS_SIGNAL_MUSIC
*
* See @ref opus_encoderctls and @ref opus_genericctls for a complete list of parameters that can be set or queried. Most parameters can be set or changed at any time during a stream.
*
* To encode a frame, opus_encode() or opus_encode_float() must be called with exactly one frame (2.5, 5, 10, 20, 40 or 60 ms) of audio data:
* @code
* len = opus_encode(enc, audio_frame, frame_size, packet, max_packet);
* @endcode
*
* where
* <ul>
* <li>audio_frame is the audio data in opus_int16 (or float for opus_encode_float())</li>
* <li>frame_size is the duration of the frame in samples (per channel)</li>
* <li>packet is the byte array to which the compressed data is written</li>
* <li>max_packet is the maximum number of bytes that can be written in the packet (4000 bytes is recommended).
* Do not use max_packet to control VBR target bitrate, instead use the #OPUS_SET_BITRATE CTL.</li>
* </ul>
*
* opus_encode() and opus_encode_float() return the number of bytes actually written to the packet.
* The return value <b>can be negative</b>, which indicates that an error has occurred. If the return value
* is 1 byte, then the packet does not need to be transmitted (DTX).
*
* Once the encoder state is no longer needed, it can be destroyed with
*
* @code
* opus_encoder_destroy(enc);
* @endcode
*
* If the encoder was created with opus_encoder_init() rather than opus_encoder_create(),
* then no action is required aside from potentially freeing the memory that was manually
* allocated for it (calling free(enc) for the example above)
*
*/
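/* The snippets above can be combined into a single end-to-end sequence. This is
* only a sketch: the 48 kHz stereo input, 20 ms frame size, and 64 kb/s bitrate
* below are arbitrary example values, not recommendations, and error handling is
* trimmed to the minimum.
* @code
* int error;
* opus_int16 pcm[960 * 2] = {0};   // one 20 ms frame of silent 48 kHz stereo
* unsigned char packet[4000];      // 4000 bytes is the recommended max_packet
* OpusEncoder *enc = opus_encoder_create(48000, 2, OPUS_APPLICATION_AUDIO, &error);
* if (error == OPUS_OK) {
*     opus_encoder_ctl(enc, OPUS_SET_BITRATE(64000));
*     opus_int32 len = opus_encode(enc, pcm, 960, packet, sizeof(packet));
*     if (len < 0) {
*         // negative return values are error codes (see opus_errorcodes)
*     }
*     opus_encoder_destroy(enc);
* }
* @endcode
*/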
/** Opus encoder state.
* This contains the complete state of an Opus encoder.
* It is position independent and can be freely copied.
* @see opus_encoder_create,opus_encoder_init
*/
typedef struct OpusEncoder OpusEncoder;
/** Gets the size of an <code>OpusEncoder</code> structure.
* @param[in] channels <tt>int</tt>: Number of channels.
* This must be 1 or 2.
* @returns The size in bytes.
*/
OPUS_EXPORT OPUS_WARN_UNUSED_RESULT int opus_encoder_get_size(int channels);
/**
*/
/** Allocates and initializes an encoder state.
* There are three coding modes:
*
* @ref OPUS_APPLICATION_VOIP gives best quality at a given bitrate for voice
* signals. It enhances the input signal by high-pass filtering and
* emphasizing formants and harmonics. Optionally it includes in-band
* forward error correction to protect against packet loss. Use this
* mode for typical VoIP applications. Because of the enhancement,
* even at high bitrates the output may sound different from the input.
*
* @ref OPUS_APPLICATION_AUDIO gives best quality at a given bitrate for most
* non-voice signals like music. Use this mode for music and mixed
* (music/voice) content, broadcast, and applications requiring less
* than 15 ms of coding delay.
*
* @ref OPUS_APPLICATION_RESTRICTED_LOWDELAY configures low-delay mode that
* disables the speech-optimized mode in exchange for slightly reduced delay.
* This mode can only be set on a newly initialized or freshly reset encoder
* because it changes the codec delay.
*
* This is useful when the caller knows that the speech-optimized modes will not be needed (use with caution).
* @param [in] Fs <tt>opus_int32</tt>: Sampling rate of input signal (Hz)
* This must be one of 8000, 12000, 16000,
* 24000, or 48000.
* @param [in] channels <tt>int</tt>: Number of channels (1 or 2) in input signal
* @param [in] application <tt>int</tt>: Coding mode (@ref OPUS_APPLICATION_VOIP/@ref OPUS_APPLICATION_AUDIO/@ref OPUS_APPLICATION_RESTRICTED_LOWDELAY)
* @param [out] error <tt>int*</tt>: @ref opus_errorcodes
* @note Regardless of the sampling rate and number of channels selected, the Opus encoder
* can switch to a lower audio bandwidth or number of channels if the bitrate
* selected is too low. This also means that it is safe to always use 48 kHz stereo input
* and let the encoder optimize the encoding.
*/
OPUS_EXPORT OPUS_WARN_UNUSED_RESULT OpusEncoder *opus_encoder_create(
opus_int32 Fs,
int channels,
int application,
int *error
);
/** Initializes a previously allocated encoder state
* The memory pointed to by st must be at least the size returned by opus_encoder_get_size().
* This is intended for applications which use their own allocator instead of malloc.
* @see opus_encoder_create(),opus_encoder_get_size()
* To reset a previously initialized state, use the #OPUS_RESET_STATE CTL.
* @param [in] st <tt>OpusEncoder*</tt>: Encoder state
* @param [in] Fs <tt>opus_int32</tt>: Sampling rate of input signal (Hz)
* This must be one of 8000, 12000, 16000,
* 24000, or 48000.
* @param [in] channels <tt>int</tt>: Number of channels (1 or 2) in input signal
* @param [in] application <tt>int</tt>: Coding mode (OPUS_APPLICATION_VOIP/OPUS_APPLICATION_AUDIO/OPUS_APPLICATION_RESTRICTED_LOWDELAY)
* @retval #OPUS_OK Success or @ref opus_errorcodes
*/
OPUS_EXPORT int opus_encoder_init(
OpusEncoder *st,
opus_int32 Fs,
int channels,
int application
) OPUS_ARG_NONNULL(1);
/** Encodes an Opus frame.
* @param [in] st <tt>OpusEncoder*</tt>: Encoder state
* @param [in] pcm <tt>opus_int16*</tt>: Input signal (interleaved if 2 channels). length is frame_size*channels*sizeof(opus_int16)
* @param [in] frame_size <tt>int</tt>: Number of samples per channel in the
* input signal.
* This must be an Opus frame size for
* the encoder's sampling rate.
* For example, at 48 kHz the permitted
* values are 120, 240, 480, 960, 1920,
* and 2880.
* Passing in a duration of less than
* 10 ms (480 samples at 48 kHz) will
* prevent the encoder from using the LPC
* or hybrid modes.
* @param [out] data <tt>unsigned char*</tt>: Output payload.
* This must contain storage for at
* least \a max_data_bytes.
* @param [in] max_data_bytes <tt>opus_int32</tt>: Size of the allocated
* memory for the output
* payload. This may be
* used to impose an upper limit on
* the instant bitrate, but should
* not be used as the only bitrate
* control. Use #OPUS_SET_BITRATE to
* control the bitrate.
* @returns The length of the encoded packet (in bytes) on success or a
* negative error code (see @ref opus_errorcodes) on failure.
*/
OPUS_EXPORT OPUS_WARN_UNUSED_RESULT opus_int32 opus_encode(
OpusEncoder *st,
const opus_int16 *pcm,
int frame_size,
unsigned char *data,
opus_int32 max_data_bytes
) OPUS_ARG_NONNULL(1) OPUS_ARG_NONNULL(2) OPUS_ARG_NONNULL(4);
/** Encodes an Opus frame from floating point input.
* @param [in] st <tt>OpusEncoder*</tt>: Encoder state
* @param [in] pcm <tt>float*</tt>: Input in float format (interleaved if 2 channels), with a normal range of +/-1.0.
* Samples with a range beyond +/-1.0 are supported but will
* be clipped by decoders using the integer API and should
* only be used if it is known that the far end supports
* extended dynamic range.
* length is frame_size*channels*sizeof(float)
* @param [in] frame_size <tt>int</tt>: Number of samples per channel in the
* input signal.
* This must be an Opus frame size for
* the encoder's sampling rate.
* For example, at 48 kHz the permitted
* values are 120, 240, 480, 960, 1920,
* and 2880.
* Passing in a duration of less than
* 10 ms (480 samples at 48 kHz) will
* prevent the encoder from using the LPC
* or hybrid modes.
* @param [out] data <tt>unsigned char*</tt>: Output payload.
* This must contain storage for at
* least \a max_data_bytes.
* @param [in] max_data_bytes <tt>opus_int32</tt>: Size of the allocated
* memory for the output
* payload. This may be
* used to impose an upper limit on
* the instant bitrate, but should
* not be used as the only bitrate
* control. Use #OPUS_SET_BITRATE to
* control the bitrate.
* @returns The length of the encoded packet (in bytes) on success or a
* negative error code (see @ref opus_errorcodes) on failure.
*/
OPUS_EXPORT OPUS_WARN_UNUSED_RESULT opus_int32 opus_encode_float(
OpusEncoder *st,
const float *pcm,
int frame_size,
unsigned char *data,
opus_int32 max_data_bytes
) OPUS_ARG_NONNULL(1) OPUS_ARG_NONNULL(2) OPUS_ARG_NONNULL(4);
/** Frees an <code>OpusEncoder</code> allocated by opus_encoder_create().
* @param[in] st <tt>OpusEncoder*</tt>: State to be freed.
*/
OPUS_EXPORT void opus_encoder_destroy(OpusEncoder *st);
/** Perform a CTL function on an Opus encoder.
*
* Generally the request and subsequent arguments are generated
* by a convenience macro.
* @param st <tt>OpusEncoder*</tt>: Encoder state.
* @param request This and all remaining parameters should be replaced by one
* of the convenience macros in @ref opus_genericctls or
* @ref opus_encoderctls.
* @see opus_genericctls
* @see opus_encoderctls
*/
OPUS_EXPORT int opus_encoder_ctl(OpusEncoder *st, int request, ...) OPUS_ARG_NONNULL(1);
/**@}*/
/** @defgroup opus_decoder Opus Decoder
* @{
*
* @brief This page describes the process and functions used to decode Opus.
*
* The decoding process also starts with creating a decoder
* state. This can be done with:
* @code
* int error;
* OpusDecoder *dec;
* dec = opus_decoder_create(Fs, channels, &error);
* @endcode
* where
* @li Fs is the sampling rate and must be 8000, 12000, 16000, 24000, or 48000
* @li channels is the number of channels (1 or 2)
* @li error will hold the error code in case of failure (or #OPUS_OK on success)
* @li the return value is a newly created decoder state to be used for decoding
*
* While opus_decoder_create() allocates memory for the state, it's also possible
* to initialize pre-allocated memory:
* @code
* int size;
* int error;
* OpusDecoder *dec;
* size = opus_decoder_get_size(channels);
* dec = malloc(size);
* error = opus_decoder_init(dec, Fs, channels);
* @endcode
* where opus_decoder_get_size() returns the required size for the decoder state. Note that
* future versions of this code may change the size, so no assumptions should be made about it.
*
* The decoder state is always continuous in memory and only a shallow copy is sufficient
* to copy it (e.g. memcpy())
*
* To decode a frame, opus_decode() or opus_decode_float() must be called with a packet of compressed audio data:
* @code
* frame_size = opus_decode(dec, packet, len, decoded, max_size, 0);
* @endcode
* where
*
* @li packet is the byte array containing the compressed data
* @li len is the exact number of bytes contained in the packet
* @li decoded is the decoded audio data in opus_int16 (or float for opus_decode_float())
* @li max_size is the max duration of the frame in samples (per channel) that can fit into the decoded_frame array
*
* opus_decode() and opus_decode_float() return the number of samples (per channel) decoded from the packet.
* If that value is negative, then an error has occurred. This can occur if the packet is corrupted or if the audio
* buffer is too small to hold the decoded audio.
*
* Opus is a stateful codec with overlapping blocks and as a result Opus
* packets are not coded independently of each other. Packets must be
* passed into the decoder serially and in the correct order for a correct
* decode. Lost packets can be replaced with loss concealment by calling
* the decoder with a null pointer and zero length for the missing packet.
*
* A single codec state may only be accessed from a single thread at
* a time and any required locking must be performed by the caller. Separate
* streams must be decoded with separate decoder states and can be decoded
* in parallel unless the library was compiled with NONTHREADSAFE_PSEUDOSTACK
* defined.
*
*/
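/* Pulling the pieces above together (a sketch only: the 48 kHz stereo decoder
* and the 20 ms concealment frame are example choices, and packet/len are the
* compressed input exactly as in the opus_decode() snippet above):
* @code
* int error;
* opus_int16 pcm[5760 * 2];   // large enough for the 120 ms maximum packet duration
* OpusDecoder *dec = opus_decoder_create(48000, 2, &error);
* if (error == OPUS_OK) {
*     // decode a received packet
*     int samples = opus_decode(dec, packet, len, pcm, 5760, 0);
*     // conceal a lost packet by passing NULL data for one 20 ms frame
*     samples = opus_decode(dec, NULL, 0, pcm, 960, 0);
*     if (samples < 0) {
*         // negative return values are error codes (see opus_errorcodes)
*     }
*     opus_decoder_destroy(dec);
* }
* @endcode
*/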
/** Opus decoder state.
* This contains the complete state of an Opus decoder.
* It is position independent and can be freely copied.
* @see opus_decoder_create,opus_decoder_init
*/
typedef struct OpusDecoder OpusDecoder;
/** Gets the size of an <code>OpusDecoder</code> structure.
* @param [in] channels <tt>int</tt>: Number of channels.
* This must be 1 or 2.
* @returns The size in bytes.
*/
OPUS_EXPORT OPUS_WARN_UNUSED_RESULT int opus_decoder_get_size(int channels);
/** Allocates and initializes a decoder state.
* @param [in] Fs <tt>opus_int32</tt>: Sample rate to decode at (Hz).
* This must be one of 8000, 12000, 16000,
* 24000, or 48000.
* @param [in] channels <tt>int</tt>: Number of channels (1 or 2) to decode
* @param [out] error <tt>int*</tt>: #OPUS_OK Success or @ref opus_errorcodes
*
* Internally Opus stores data at 48000 Hz, so that should be the default
* value for Fs. However, the decoder can efficiently decode to buffers
* at 8, 12, 16, and 24 kHz so if for some reason the caller cannot use
* data at the full sample rate, or knows the compressed data doesn't
* use the full frequency range, it can request decoding at a reduced
* rate. Likewise, the decoder is capable of filling in either mono or
* interleaved stereo pcm buffers, at the caller's request.
*/
OPUS_EXPORT OPUS_WARN_UNUSED_RESULT OpusDecoder *opus_decoder_create(
opus_int32 Fs,
int channels,
int *error
);
/** Initializes a previously allocated decoder state.
* The state must be at least the size returned by opus_decoder_get_size().
* This is intended for applications which use their own allocator instead of malloc. @see opus_decoder_create,opus_decoder_get_size
* To reset a previously initialized state, use the #OPUS_RESET_STATE CTL.
* @param [in] st <tt>OpusDecoder*</tt>: Decoder state.
* @param [in] Fs <tt>opus_int32</tt>: Sampling rate to decode to (Hz).
* This must be one of 8000, 12000, 16000,
* 24000, or 48000.
* @param [in] channels <tt>int</tt>: Number of channels (1 or 2) to decode
* @retval #OPUS_OK Success or @ref opus_errorcodes
*/
OPUS_EXPORT int opus_decoder_init(
OpusDecoder *st,
opus_int32 Fs,
int channels
) OPUS_ARG_NONNULL(1);
/** Decode an Opus packet.
* @param [in] st <tt>OpusDecoder*</tt>: Decoder state
* @param [in] data <tt>char*</tt>: Input payload. Use a NULL pointer to indicate packet loss
* @param [in] len <tt>opus_int32</tt>: Number of bytes in payload
* @param [out] pcm <tt>opus_int16*</tt>: Output signal (interleaved if 2 channels). length
* is frame_size*channels*sizeof(opus_int16)
* @param [in] frame_size Number of samples per channel of available space in \a pcm.
* If this is less than the maximum packet duration (120ms; 5760 for 48kHz), this function will
* not be capable of decoding some packets. In the case of PLC (data==NULL) or FEC (decode_fec=1),
* then frame_size needs to be exactly the duration of audio that is missing, otherwise the
* decoder will not be in the optimal state to decode the next incoming packet. For the PLC and
* FEC cases, frame_size <b>must</b> be a multiple of 2.5 ms.
* @param [in] decode_fec <tt>int</tt>: Flag (0 or 1) to request that any in-band forward error correction data be
* decoded. If no such data is available, the frame is decoded as if it were lost.
* @returns Number of decoded samples or @ref opus_errorcodes
*/
OPUS_EXPORT OPUS_WARN_UNUSED_RESULT int opus_decode(
OpusDecoder *st,
const unsigned char *data,
opus_int32 len,
opus_int16 *pcm,
int frame_size,
int decode_fec
) OPUS_ARG_NONNULL(1) OPUS_ARG_NONNULL(4);
/** Decode an Opus packet with floating point output.
* @param [in] st <tt>OpusDecoder*</tt>: Decoder state
* @param [in] data <tt>char*</tt>: Input payload. Use a NULL pointer to indicate packet loss
* @param [in] len <tt>opus_int32</tt>: Number of bytes in payload
* @param [out] pcm <tt>float*</tt>: Output signal (interleaved if 2 channels). length
* is frame_size*channels*sizeof(float)
* @param [in] frame_size Number of samples per channel of available space in \a pcm.
* If this is less than the maximum packet duration (120ms; 5760 for 48kHz), this function will
* not be capable of decoding some packets. In the case of PLC (data==NULL) or FEC (decode_fec=1),
* then frame_size needs to be exactly the duration of audio that is missing, otherwise the
* decoder will not be in the optimal state to decode the next incoming packet. For the PLC and
* FEC cases, frame_size <b>must</b> be a multiple of 2.5 ms.
* @param [in] decode_fec <tt>int</tt>: Flag (0 or 1) to request that any in-band forward error correction data be
* decoded. If no such data is available the frame is decoded as if it were lost.
* @returns Number of decoded samples or @ref opus_errorcodes
*/
OPUS_EXPORT OPUS_WARN_UNUSED_RESULT int opus_decode_float(
OpusDecoder *st,
const unsigned char *data,
opus_int32 len,
float *pcm,
int frame_size,
int decode_fec
) OPUS_ARG_NONNULL(1) OPUS_ARG_NONNULL(4);
/** Perform a CTL function on an Opus decoder.
*
* Generally the request and subsequent arguments are generated
* by a convenience macro.
* @param st <tt>OpusDecoder*</tt>: Decoder state.
* @param request This and all remaining parameters should be replaced by one
* of the convenience macros in @ref opus_genericctls or
* @ref opus_decoderctls.
* @see opus_genericctls
* @see opus_decoderctls
*/
OPUS_EXPORT int opus_decoder_ctl(OpusDecoder *st, int request, ...) OPUS_ARG_NONNULL(1);
/** Frees an <code>OpusDecoder</code> allocated by opus_decoder_create().
* @param[in] st <tt>OpusDecoder*</tt>: State to be freed.
*/
OPUS_EXPORT void opus_decoder_destroy(OpusDecoder *st);
/** Parse an opus packet into one or more frames.
* Opus_decode will perform this operation internally so most applications do
* not need to use this function.
* This function does not copy the frames, the returned pointers are pointers into
* the input packet.
* @param [in] data <tt>char*</tt>: Opus packet to be parsed
* @param [in] len <tt>opus_int32</tt>: size of data
* @param [out] out_toc <tt>char*</tt>: TOC pointer
* @param [out] frames <tt>char*[48]</tt> encapsulated frames
* @param [out] size <tt>opus_int16[48]</tt> sizes of the encapsulated frames
* @param [out] payload_offset <tt>int*</tt>: returns the position of the payload within the packet (in bytes)
* @returns number of frames
*/
OPUS_EXPORT int opus_packet_parse(
const unsigned char *data,
opus_int32 len,
unsigned char *out_toc,
const unsigned char *frames[48],
opus_int16 size[48],
int *payload_offset
) OPUS_ARG_NONNULL(1) OPUS_ARG_NONNULL(4);
/** Gets the bandwidth of an Opus packet.
* @param [in] data <tt>char*</tt>: Opus packet
* @retval OPUS_BANDWIDTH_NARROWBAND Narrowband (4kHz bandpass)
* @retval OPUS_BANDWIDTH_MEDIUMBAND Mediumband (6kHz bandpass)
* @retval OPUS_BANDWIDTH_WIDEBAND Wideband (8kHz bandpass)
* @retval OPUS_BANDWIDTH_SUPERWIDEBAND Superwideband (12kHz bandpass)
* @retval OPUS_BANDWIDTH_FULLBAND Fullband (20kHz bandpass)
* @retval OPUS_INVALID_PACKET The compressed data passed is corrupted or of an unsupported type
*/
OPUS_EXPORT OPUS_WARN_UNUSED_RESULT int opus_packet_get_bandwidth(const unsigned char *data) OPUS_ARG_NONNULL(1);
/** Gets the number of samples per frame from an Opus packet.
* @param [in] data <tt>char*</tt>: Opus packet.
* This must contain at least one byte of
* data.
* @param [in] Fs <tt>opus_int32</tt>: Sampling rate in Hz.
* This must be a multiple of 400, or
* inaccurate results will be returned.
* @returns Number of samples per frame.
*/
OPUS_EXPORT OPUS_WARN_UNUSED_RESULT int opus_packet_get_samples_per_frame(const unsigned char *data, opus_int32 Fs) OPUS_ARG_NONNULL(1);
/** Gets the number of channels from an Opus packet.
* @param [in] data <tt>char*</tt>: Opus packet
* @returns Number of channels
* @retval OPUS_INVALID_PACKET The compressed data passed is corrupted or of an unsupported type
*/
OPUS_EXPORT OPUS_WARN_UNUSED_RESULT int opus_packet_get_nb_channels(const unsigned char *data) OPUS_ARG_NONNULL(1);
/** Gets the number of frames in an Opus packet.
* @param [in] packet <tt>char*</tt>: Opus packet
* @param [in] len <tt>opus_int32</tt>: Length of packet
* @returns Number of frames
* @retval OPUS_BAD_ARG Insufficient data was passed to the function
* @retval OPUS_INVALID_PACKET The compressed data passed is corrupted or of an unsupported type
*/
OPUS_EXPORT OPUS_WARN_UNUSED_RESULT int opus_packet_get_nb_frames(const unsigned char packet[], opus_int32 len) OPUS_ARG_NONNULL(1);
/** Gets the number of samples of an Opus packet.
* @param [in] packet <tt>char*</tt>: Opus packet
* @param [in] len <tt>opus_int32</tt>: Length of packet
* @param [in] Fs <tt>opus_int32</tt>: Sampling rate in Hz.
* This must be a multiple of 400, or
* inaccurate results will be returned.
* @returns Number of samples
* @retval OPUS_BAD_ARG Insufficient data was passed to the function
* @retval OPUS_INVALID_PACKET The compressed data passed is corrupted or of an unsupported type
*/
OPUS_EXPORT OPUS_WARN_UNUSED_RESULT int opus_packet_get_nb_samples(const unsigned char packet[], opus_int32 len, opus_int32 Fs) OPUS_ARG_NONNULL(1);
/** Gets the number of samples of an Opus packet.
* @param [in] dec <tt>OpusDecoder*</tt>: Decoder state
* @param [in] packet <tt>char*</tt>: Opus packet
* @param [in] len <tt>opus_int32</tt>: Length of packet
* @returns Number of samples
* @retval OPUS_BAD_ARG Insufficient data was passed to the function
* @retval OPUS_INVALID_PACKET The compressed data passed is corrupted or of an unsupported type
*/
OPUS_EXPORT OPUS_WARN_UNUSED_RESULT int opus_decoder_get_nb_samples(const OpusDecoder *dec, const unsigned char packet[], opus_int32 len) OPUS_ARG_NONNULL(1) OPUS_ARG_NONNULL(2);
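/* The inspection helpers above can be combined, for example, to size a PCM
* buffer before decoding. A sketch (the 48 kHz stereo decoder dec, packet, and
* len are assumed from the decoder example earlier; error handling is trimmed):
* @code
* int samples = opus_packet_get_nb_samples(packet, len, 48000);
* if (samples > 0) {
*     // sized for the decoder's channel count (2 in this example)
*     opus_int16 *pcm = malloc(sizeof(opus_int16) * samples * 2);
*     opus_decode(dec, packet, len, pcm, samples, 0);
*     free(pcm);
* }
* @endcode
*/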
/** Applies soft-clipping to bring a float signal within the [-1,1] range. If
* the signal is already in that range, nothing is done. If there are values
* outside of [-1,1], then the signal is clipped as smoothly as possible to
* both fit in the range and avoid creating excessive distortion in the
* process.
* @param [in,out] pcm <tt>float*</tt>: Input PCM and modified PCM
* @param [in] frame_size <tt>int</tt> Number of samples per channel to process
* @param [in] channels <tt>int</tt>: Number of channels
* @param [in,out] softclip_mem <tt>float*</tt>: State memory for the soft clipping process (one float per channel, initialized to zero)
*/
OPUS_EXPORT void opus_pcm_soft_clip(float *pcm, int frame_size, int channels, float *softclip_mem);
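/* For example (a sketch: dec, packet, and len follow the decoder example
* above, pcm_f is an assumed float output buffer, and the per-channel
* softclip_mem state must persist across calls for a given stream):
* @code
* float softclip_mem[2] = {0};
* int samples = opus_decode_float(dec, packet, len, pcm_f, 5760, 0);
* if (samples > 0)
*     opus_pcm_soft_clip(pcm_f, samples, 2, softclip_mem);
* @endcode
*/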
/**@}*/
/** @defgroup opus_repacketizer Repacketizer
* @{
*
* The repacketizer can be used to merge multiple Opus packets into a single
* packet or alternatively to split Opus packets that have previously been
* merged. Splitting valid Opus packets is always guaranteed to succeed,
* whereas merging valid packets only succeeds if all frames have the same
* mode, bandwidth, and frame size, and when the total duration of the merged
* packet is no more than 120 ms.
* The repacketizer currently only operates on elementary Opus
* streams. It will not manipulate multistream packets successfully, except in
* the degenerate case where they consist of data from a single stream.
*
* The repacketizing process starts with creating a repacketizer state, either
* by calling opus_repacketizer_create() or by allocating the memory yourself,
* e.g.,
* @code
* OpusRepacketizer *rp;
* rp = (OpusRepacketizer*)malloc(opus_repacketizer_get_size());
* if (rp != NULL)
* opus_repacketizer_init(rp);
* @endcode
*
* Then the application should submit packets with opus_repacketizer_cat(),
* extract new packets with opus_repacketizer_out() or
* opus_repacketizer_out_range(), and then reset the state for the next set of
* input packets via opus_repacketizer_init().
*
* For example, to split a sequence of packets into individual frames:
* @code
* unsigned char *data;
* int len;
* while (get_next_packet(&data, &len))
* {
* unsigned char out[1276];
* opus_int32 out_len;
* int nb_frames;
* int err;
* int i;
* err = opus_repacketizer_cat(rp, data, len);
* if (err != OPUS_OK)
* {
* release_packet(data);
* return err;
* }
* nb_frames = opus_repacketizer_get_nb_frames(rp);
* for (i = 0; i < nb_frames; i++)
* {
* out_len = opus_repacketizer_out_range(rp, i, i+1, out, sizeof(out));
* if (out_len < 0)
* {
* release_packet(data);
* return (int)out_len;
* }
* output_next_packet(out, out_len);
* }
* opus_repacketizer_init(rp);
* release_packet(data);
* }
* @endcode
*
* Alternatively, to combine a sequence of frames into packets that each
* contain up to <code>TARGET_DURATION_MS</code> milliseconds of data:
* @code
* // The maximum number of packets with duration TARGET_DURATION_MS occurs
* // when the frame size is 2.5 ms, for a total of (TARGET_DURATION_MS*2/5)
* // packets.
* unsigned char *data[(TARGET_DURATION_MS*2/5)+1];
* opus_int32 len[(TARGET_DURATION_MS*2/5)+1];
* int nb_packets;
* unsigned char out[1277*(TARGET_DURATION_MS*2/2)];
* opus_int32 out_len;
* int prev_toc;
* nb_packets = 0;
* while (get_next_packet(data+nb_packets, len+nb_packets))
* {
* int nb_frames;
* int err;
* nb_frames = opus_packet_get_nb_frames(data[nb_packets], len[nb_packets]);
* if (nb_frames < 1)
* {
* release_packets(data, nb_packets+1);
* return nb_frames;
* }
* nb_frames += opus_repacketizer_get_nb_frames(rp);
* // If adding the next packet would exceed our target, or it has an
* // incompatible TOC sequence, output the packets we already have before
* // submitting it.
* // N.B., The nb_packets > 0 check ensures we've submitted at least one
* // packet since the last call to opus_repacketizer_init(). Otherwise a
* // single packet longer than TARGET_DURATION_MS would cause us to try to
* // output an (invalid) empty packet. It also ensures that prev_toc has
* // been set to a valid value. Additionally, len[nb_packets] > 0 is
* // guaranteed by the call to opus_packet_get_nb_frames() above, so the
* // reference to data[nb_packets][0] should be valid.
* if (nb_packets > 0 && (
* ((prev_toc & 0xFC) != (data[nb_packets][0] & 0xFC)) ||
* opus_packet_get_samples_per_frame(data[nb_packets], 48000)*nb_frames >
* TARGET_DURATION_MS*48))
* {
* out_len = opus_repacketizer_out(rp, out, sizeof(out));
* if (out_len < 0)
* {
* release_packets(data, nb_packets+1);
* return (int)out_len;
* }
* output_next_packet(out, out_len);
* opus_repacketizer_init(rp);
* release_packets(data, nb_packets);
* data[0] = data[nb_packets];
* len[0] = len[nb_packets];
* nb_packets = 0;
* }
* err = opus_repacketizer_cat(rp, data[nb_packets], len[nb_packets]);
* if (err != OPUS_OK)
* {
* release_packets(data, nb_packets+1);
* return err;
* }
* prev_toc = data[nb_packets][0];
* nb_packets++;
* }
* // Output the final, partial packet.
* if (nb_packets > 0)
* {
* out_len = opus_repacketizer_out(rp, out, sizeof(out));
* release_packets(data, nb_packets);
* if (out_len < 0)
* return (int)out_len;
* output_next_packet(out, out_len);
* }
* @endcode
*
* An alternate way of merging packets is to simply call opus_repacketizer_cat()
* unconditionally until it fails. At that point, the merged packet can be
* obtained with opus_repacketizer_out() and the input packet for which
* opus_repacketizer_cat() needs to be re-added to a newly reinitialized
* repacketizer state.
*/
typedef struct OpusRepacketizer OpusRepacketizer;
/** Gets the size of an <code>OpusRepacketizer</code> structure.
* @returns The size in bytes.
*/
OPUS_EXPORT OPUS_WARN_UNUSED_RESULT int opus_repacketizer_get_size(void);
/** (Re)initializes a previously allocated repacketizer state.
* The state must be at least the size returned by opus_repacketizer_get_size().
* This can be used for applications which use their own allocator instead of
* malloc().
* It must also be called to reset the queue of packets waiting to be
* repacketized, which is necessary if the maximum packet duration of 120 ms
* is reached or if you wish to submit packets with a different Opus
* configuration (coding mode, audio bandwidth, frame size, or channel count).
* Failure to do so will prevent a new packet from being added with
* opus_repacketizer_cat().
* @see opus_repacketizer_create
* @see opus_repacketizer_get_size
* @see opus_repacketizer_cat
* @param rp <tt>OpusRepacketizer*</tt>: The repacketizer state to
* (re)initialize.
* @returns A pointer to the same repacketizer state that was passed in.
*/
OPUS_EXPORT OpusRepacketizer *opus_repacketizer_init(OpusRepacketizer *rp) OPUS_ARG_NONNULL(1);
/** Allocates memory and initializes the new repacketizer with
* opus_repacketizer_init().
*/
OPUS_EXPORT OPUS_WARN_UNUSED_RESULT OpusRepacketizer *opus_repacketizer_create(void);
/** Frees an <code>OpusRepacketizer</code> allocated by
* opus_repacketizer_create().
* @param[in] rp <tt>OpusRepacketizer*</tt>: State to be freed.
*/
OPUS_EXPORT void opus_repacketizer_destroy(OpusRepacketizer *rp);
/** Add a packet to the current repacketizer state.
* This packet must match the configuration of any packets already submitted
* for repacketization since the last call to opus_repacketizer_init().
* This means that it must have the same coding mode, audio bandwidth, frame
* size, and channel count.
* This can be checked in advance by examining the top 6 bits of the first
* byte of the packet, and ensuring they match the top 6 bits of the first
* byte of any previously submitted packet.
* The total duration of audio in the repacketizer state also must not exceed
* 120 ms, the maximum duration of a single packet, after adding this packet.
*
* The contents of the current repacketizer state can be extracted into new
* packets using opus_repacketizer_out() or opus_repacketizer_out_range().
*
* In order to add a packet with a different configuration or to add more
* audio beyond 120 ms, you must clear the repacketizer state by calling
* opus_repacketizer_init().
* If a packet is too large to add to the current repacketizer state, no part
* of it is added, even if it contains multiple frames, some of which might
* fit.
* If you wish to be able to add parts of such packets, you should first use
* another repacketizer to split the packet into pieces and add them
* individually.
* @see opus_repacketizer_out_range
* @see opus_repacketizer_out
* @see opus_repacketizer_init
* @param rp <tt>OpusRepacketizer*</tt>: The repacketizer state to which to
* add the packet.
* @param[in] data <tt>const unsigned char*</tt>: The packet data.
* The application must ensure
* this pointer remains valid
* until the next call to
* opus_repacketizer_init() or
* opus_repacketizer_destroy().
* @param len <tt>opus_int32</tt>: The number of bytes in the packet data.
* @returns An error code indicating whether or not the operation succeeded.
* @retval #OPUS_OK The packet's contents have been added to the repacketizer
* state.
* @retval #OPUS_INVALID_PACKET The packet did not have a valid TOC sequence,
* the packet's TOC sequence was not compatible
* with previously submitted packets (because
* the coding mode, audio bandwidth, frame size,
* or channel count did not match), or adding
* this packet would increase the total amount of
* audio stored in the repacketizer state to more
* than 120 ms.
*/
OPUS_EXPORT int opus_repacketizer_cat(OpusRepacketizer *rp, const unsigned char *data, opus_int32 len) OPUS_ARG_NONNULL(1) OPUS_ARG_NONNULL(2);
/** Construct a new packet from data previously submitted to the repacketizer
* state via opus_repacketizer_cat().
* @param rp <tt>OpusRepacketizer*</tt>: The repacketizer state from which to
* construct the new packet.
* @param begin <tt>int</tt>: The index of the first frame in the current
* repacketizer state to include in the output.
* @param end <tt>int</tt>: One past the index of the last frame in the
* current repacketizer state to include in the
* output.
* @param[out] data <tt>const unsigned char*</tt>: The buffer in which to
* store the output packet.
* @param maxlen <tt>opus_int32</tt>: The maximum number of bytes to store in
* the output buffer. In order to guarantee
* success, this should be at least
* <code>1276</code> for a single frame,
* or for multiple frames,
* <code>1277*(end-begin)</code>.
* However, <code>1*(end-begin)</code> plus
* the size of all packet data submitted to
* the repacketizer since the last call to
* opus_repacketizer_init() or
* opus_repacketizer_create() is also
* sufficient, and possibly much smaller.
* @returns The total size of the output packet on success, or an error code
* on failure.
* @retval #OPUS_BAD_ARG <code>[begin,end)</code> was an invalid range of
* frames (begin < 0, begin >= end, or end >
* opus_repacketizer_get_nb_frames()).
* @retval #OPUS_BUFFER_TOO_SMALL \a maxlen was insufficient to contain the
* complete output packet.
*/
OPUS_EXPORT OPUS_WARN_UNUSED_RESULT opus_int32 opus_repacketizer_out_range(OpusRepacketizer *rp, int begin, int end, unsigned char *data, opus_int32 maxlen) OPUS_ARG_NONNULL(1) OPUS_ARG_NONNULL(4);
/** Return the total number of frames contained in packet data submitted to
* the repacketizer state so far via opus_repacketizer_cat() since the last
* call to opus_repacketizer_init() or opus_repacketizer_create().
* This defines the valid range of packets that can be extracted with
* opus_repacketizer_out_range() or opus_repacketizer_out().
* @param rp <tt>OpusRepacketizer*</tt>: The repacketizer state containing the
* frames.
* @returns The total number of frames contained in the packet data submitted
* to the repacketizer state.
*/
OPUS_EXPORT OPUS_WARN_UNUSED_RESULT int opus_repacketizer_get_nb_frames(OpusRepacketizer *rp) OPUS_ARG_NONNULL(1);
/** Construct a new packet from data previously submitted to the repacketizer
* state via opus_repacketizer_cat().
* This is a convenience routine that returns all the data submitted so far
* in a single packet.
* It is equivalent to calling
* @code
* opus_repacketizer_out_range(rp, 0, opus_repacketizer_get_nb_frames(rp),
* data, maxlen)
* @endcode
* @param rp <tt>OpusRepacketizer*</tt>: The repacketizer state from which to
* construct the new packet.
* @param[out] data <tt>const unsigned char*</tt>: The buffer in which to
* store the output packet.
* @param maxlen <tt>opus_int32</tt>: The maximum number of bytes to store in
* the output buffer. In order to guarantee
* success, this should be at least
* <code>1277*opus_repacketizer_get_nb_frames(rp)</code>.
* However,
* <code>1*opus_repacketizer_get_nb_frames(rp)</code>
* plus the size of all packet data
* submitted to the repacketizer since the
* last call to opus_repacketizer_init() or
* opus_repacketizer_create() is also
* sufficient, and possibly much smaller.
* @returns The total size of the output packet on success, or an error code
* on failure.
* @retval #OPUS_BUFFER_TOO_SMALL \a maxlen was insufficient to contain the
* complete output packet.
*/
OPUS_EXPORT OPUS_WARN_UNUSED_RESULT opus_int32 opus_repacketizer_out(OpusRepacketizer *rp, unsigned char *data, opus_int32 maxlen) OPUS_ARG_NONNULL(1);
/** Pads a given Opus packet to a larger size (possibly changing the TOC sequence).
* @param[in,out] data <tt>const unsigned char*</tt>: The buffer containing the
* packet to pad.
* @param len <tt>opus_int32</tt>: The size of the packet.
* This must be at least 1.
* @param new_len <tt>opus_int32</tt>: The desired size of the packet after padding.
* This must be at least as large as len.
* @returns an error code
* @retval #OPUS_OK \a on success.
* @retval #OPUS_BAD_ARG \a len was less than 1 or new_len was less than len.
* @retval #OPUS_INVALID_PACKET \a data did not contain a valid Opus packet.
*/
OPUS_EXPORT int opus_packet_pad(unsigned char *data, opus_int32 len, opus_int32 new_len);
/** Remove all padding from a given Opus packet and rewrite the TOC sequence to
* minimize space usage.
* @param[in,out] data <tt>const unsigned char*</tt>: The buffer containing the
* packet to strip.
* @param len <tt>opus_int32</tt>: The size of the packet.
* This must be at least 1.
* @returns The new size of the output packet on success, or an error code
* on failure.
* @retval #OPUS_BAD_ARG \a len was less than 1.
* @retval #OPUS_INVALID_PACKET \a data did not contain a valid Opus packet.
*/
OPUS_EXPORT OPUS_WARN_UNUSED_RESULT opus_int32 opus_packet_unpad(unsigned char *data, opus_int32 len);
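/* As a brief illustration of the two calls above (a sketch; the 1024-byte
* constant packet size is an arbitrary example value, and buf is assumed to
* already hold a valid Opus packet of len bytes, len <= sizeof(buf)):
* @code
* unsigned char buf[1024];
* if (opus_packet_pad(buf, len, sizeof(buf)) == OPUS_OK) {
*     // send the constant-size packet, then recover the compact form
*     opus_int32 new_len = opus_packet_unpad(buf, sizeof(buf));
* }
* @endcode
*/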
/** Pads a given Opus multi-stream packet to a larger size (possibly changing the TOC sequence).
* @param[in,out] data <tt>const unsigned char*</tt>: The buffer containing the
* packet to pad.
* @param len <tt>opus_int32</tt>: The size of the packet.
* This must be at least 1.
* @param new_len <tt>opus_int32</tt>: The desired size of the packet after padding.
* This must be at least as large as len.
* @param nb_streams <tt>opus_int32</tt>: The number of streams (not channels) in the packet.
* This must be at least 1.
* @returns an error code
* @retval #OPUS_OK \a on success.
* @retval #OPUS_BAD_ARG \a len was less than 1.
* @retval #OPUS_INVALID_PACKET \a data did not contain a valid Opus packet.
*/
OPUS_EXPORT int opus_multistream_packet_pad(unsigned char *data, opus_int32 len, opus_int32 new_len, int nb_streams);
/** Remove all padding from a given Opus multi-stream packet and rewrite the TOC sequence to
* minimize space usage.
* @param[in,out] data <tt>const unsigned char*</tt>: The buffer containing the
* packet to strip.
* @param len <tt>opus_int32</tt>: The size of the packet.
* This must be at least 1.
* @param nb_streams <tt>opus_int32</tt>: The number of streams (not channels) in the packet.
* This must be at least 1.
* @returns The new size of the output packet on success, or an error code
* on failure.
* @retval #OPUS_BAD_ARG \a len was less than 1.
* @retval #OPUS_INVALID_PACKET \a data did not contain a valid Opus packet.
*/
OPUS_EXPORT OPUS_WARN_UNUSED_RESULT opus_int32 opus_multistream_packet_unpad(unsigned char *data, opus_int32 len, int nb_streams);
/**@}*/
#ifdef __cplusplus
}
#endif
#endif /* OPUS_H */

View File

@@ -1,342 +0,0 @@
/* Copyright (c) 2007-2008 CSIRO
Copyright (c) 2007-2009 Xiph.Org Foundation
Copyright (c) 2008-2012 Gregory Maxwell
Written by Jean-Marc Valin and Gregory Maxwell */
/*
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions
are met:
- Redistributions of source code must retain the above copyright
notice, this list of conditions and the following disclaimer.
- Redistributions in binary form must reproduce the above copyright
notice, this list of conditions and the following disclaimer in the
documentation and/or other materials provided with the distribution.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER
OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF
LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
/**
@file opus_custom.h
@brief Opus-Custom reference implementation API
*/
#ifndef OPUS_CUSTOM_H
#define OPUS_CUSTOM_H
#include "opus_defines.h"
#ifdef __cplusplus
extern "C" {
#endif
#ifdef CUSTOM_MODES
# define OPUS_CUSTOM_EXPORT OPUS_EXPORT
# define OPUS_CUSTOM_EXPORT_STATIC OPUS_EXPORT
#else
# define OPUS_CUSTOM_EXPORT
# ifdef OPUS_BUILD
# define OPUS_CUSTOM_EXPORT_STATIC static OPUS_INLINE
# else
# define OPUS_CUSTOM_EXPORT_STATIC
# endif
#endif
/** @defgroup opus_custom Opus Custom
* @{
* Opus Custom is an optional part of the Opus specification and
* reference implementation which uses a distinct API from the regular
* API and supports frame sizes that are not normally supported.\ Use
* of Opus Custom is discouraged for all but very special applications
* for which a frame size different from 2.5, 5, 10, or 20 ms is needed
* (for either complexity or latency reasons) and where interoperability
* is less important.
*
* In addition to the interoperability limitations the use of Opus custom
* disables a substantial chunk of the codec and generally lowers the
* quality available at a given bitrate. Normally when an application needs
* a different frame size from the codec it should buffer to match the
* sizes but this adds a small amount of delay which may be important
* in some very low latency applications. Some transports (especially
* constant rate RF transports) may also work best with frames of
* particular durations.
*
* Libopus only supports custom modes if they are enabled at compile time.
*
* The Opus Custom API is similar to the regular API but the
* @ref opus_encoder_create and @ref opus_decoder_create calls take
* an additional mode parameter which is a structure produced by
* a call to @ref opus_custom_mode_create. Both the encoder and decoder
* must create a mode using the same sample rate (fs) and frame size
* (frame size) so these parameters must either be signaled out of band
* or fixed in a particular implementation.
*
* Similar to regular Opus the custom modes support on the fly frame size
* switching, but the sizes available depend on the particular frame size in
* use. For some initial frame sizes only a single on-the-fly size is available.
*/
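/* A sketch of the flow described above, assuming the library was built with
* custom modes enabled (the 48 kHz rate, 240-sample frame size, and stereo
* channel count are arbitrary example values; both endpoints must agree on
* them out of band, and the decoder-side calls mirror the encoder ones):
* @code
* int error;
* OpusCustomMode *mode = opus_custom_mode_create(48000, 240, &error);
* OpusCustomEncoder *enc = opus_custom_encoder_create(mode, 2, &error);
* OpusCustomDecoder *dec = opus_custom_decoder_create(mode, 2, &error);
* // ... encode and decode frames of exactly 240 samples per channel ...
* opus_custom_encoder_destroy(enc);
* opus_custom_decoder_destroy(dec);
* opus_custom_mode_destroy(mode);
* @endcode
*/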
/** Contains the state of an encoder. One encoder state is needed
for each stream. It is initialized once at the beginning of the
stream. Do *not* re-initialize the state for every frame.
@brief Encoder state
*/
typedef struct OpusCustomEncoder OpusCustomEncoder;
/** State of the decoder. One decoder state is needed for each stream.
It is initialized once at the beginning of the stream. Do *not*
re-initialize the state for every frame.
@brief Decoder state
*/
typedef struct OpusCustomDecoder OpusCustomDecoder;
/** The mode contains all the information necessary to create an
encoder. Both the encoder and decoder need to be initialized
with exactly the same mode, otherwise the output will be
corrupted.
@brief Mode configuration
*/
typedef struct OpusCustomMode OpusCustomMode;
/** Creates a new mode struct. This will be passed to an encoder or
* decoder. The mode MUST NOT BE DESTROYED until the encoders and
* decoders that use it are destroyed as well.
* @param [in] Fs <tt>int</tt>: Sampling rate (8000 to 96000 Hz)
* @param [in] frame_size <tt>int</tt>: Number of samples (per channel) to encode in each
* packet (64 - 1024, prime factorization must contain zero or more 2s, 3s, or 5s and no other primes)
* @param [out] error <tt>int*</tt>: Returned error code (if NULL, no error will be returned)
* @return A newly created mode
*/
OPUS_CUSTOM_EXPORT OPUS_WARN_UNUSED_RESULT OpusCustomMode *opus_custom_mode_create(opus_int32 Fs, int frame_size, int *error);
/** Destroys a mode struct. Only call this after all encoders and
* decoders using this mode are destroyed as well.
* @param [in] mode <tt>OpusCustomMode*</tt>: Mode to be freed.
*/
OPUS_CUSTOM_EXPORT void opus_custom_mode_destroy(OpusCustomMode *mode);
#if !defined(OPUS_BUILD) || defined(CELT_ENCODER_C)
/* Encoder */
/** Gets the size of an OpusCustomEncoder structure.
* @param [in] mode <tt>OpusCustomMode *</tt>: Mode configuration
* @param [in] channels <tt>int</tt>: Number of channels
* @returns size
*/
OPUS_CUSTOM_EXPORT_STATIC OPUS_WARN_UNUSED_RESULT int opus_custom_encoder_get_size(
const OpusCustomMode *mode,
int channels
) OPUS_ARG_NONNULL(1);
# ifdef CUSTOM_MODES
/** Initializes a previously allocated encoder state
* The memory pointed to by st must be the size returned by opus_custom_encoder_get_size.
* This is intended for applications which use their own allocator instead of malloc.
* @see opus_custom_encoder_create(),opus_custom_encoder_get_size()
* To reset a previously initialized state use the OPUS_RESET_STATE CTL.
* @param [in] st <tt>OpusCustomEncoder*</tt>: Encoder state
* @param [in] mode <tt>OpusCustomMode *</tt>: Contains all the information about the characteristics of
* the stream (must be the same characteristics as used for the
* decoder)
* @param [in] channels <tt>int</tt>: Number of channels
* @return OPUS_OK Success or @ref opus_errorcodes
*/
OPUS_CUSTOM_EXPORT int opus_custom_encoder_init(
OpusCustomEncoder *st,
const OpusCustomMode *mode,
int channels
) OPUS_ARG_NONNULL(1) OPUS_ARG_NONNULL(2);
# endif
#endif
/** Creates a new encoder state. Each stream needs its own encoder
* state (can't be shared across simultaneous streams).
* @param [in] mode <tt>OpusCustomMode*</tt>: Contains all the information about the characteristics of
* the stream (must be the same characteristics as used for the
* decoder)
* @param [in] channels <tt>int</tt>: Number of channels
* @param [out] error <tt>int*</tt>: Returns an error code
* @return Newly created encoder state.
*/
OPUS_CUSTOM_EXPORT OPUS_WARN_UNUSED_RESULT OpusCustomEncoder *opus_custom_encoder_create(
const OpusCustomMode *mode,
int channels,
int *error
) OPUS_ARG_NONNULL(1);
/** Destroys an encoder state.
* @param[in] st <tt>OpusCustomEncoder*</tt>: State to be freed.
*/
OPUS_CUSTOM_EXPORT void opus_custom_encoder_destroy(OpusCustomEncoder *st);
/** Encodes a frame of audio.
* @param [in] st <tt>OpusCustomEncoder*</tt>: Encoder state
* @param [in] pcm <tt>float*</tt>: PCM audio in float format, with a normal range of +/-1.0.
* Samples with a range beyond +/-1.0 are supported but will
* be clipped by decoders using the integer API and should
* only be used if it is known that the far end supports
* extended dynamic range. There must be exactly
* frame_size samples per channel.
* @param [in] frame_size <tt>int</tt>: Number of samples per frame of input signal
* @param [out] compressed <tt>char *</tt>: The compressed data is written here. This may not alias pcm and must be at least maxCompressedBytes long.
* @param [in] maxCompressedBytes <tt>int</tt>: Maximum number of bytes to use for compressing the frame
* (can change from one frame to another)
* @return Number of bytes written to "compressed".
* If negative, an error has occurred (see error codes). It is IMPORTANT that
* the length returned be somehow transmitted to the decoder. Otherwise, no
* decoding is possible.
*/
OPUS_CUSTOM_EXPORT OPUS_WARN_UNUSED_RESULT int opus_custom_encode_float(
OpusCustomEncoder *st,
const float *pcm,
int frame_size,
unsigned char *compressed,
int maxCompressedBytes
) OPUS_ARG_NONNULL(1) OPUS_ARG_NONNULL(2) OPUS_ARG_NONNULL(4);
/** Encodes a frame of audio.
* @param [in] st <tt>OpusCustomEncoder*</tt>: Encoder state
* @param [in] pcm <tt>opus_int16*</tt>: PCM audio in signed 16-bit format (native endian).
* There must be exactly frame_size samples per channel.
* @param [in] frame_size <tt>int</tt>: Number of samples per frame of input signal
* @param [out] compressed <tt>char *</tt>: The compressed data is written here. This may not alias pcm and must be at least maxCompressedBytes long.
* @param [in] maxCompressedBytes <tt>int</tt>: Maximum number of bytes to use for compressing the frame
* (can change from one frame to another)
* @return Number of bytes written to "compressed".
* If negative, an error has occurred (see error codes). It is IMPORTANT that
* the length returned be somehow transmitted to the decoder. Otherwise, no
* decoding is possible.
*/
OPUS_CUSTOM_EXPORT OPUS_WARN_UNUSED_RESULT int opus_custom_encode(
OpusCustomEncoder *st,
const opus_int16 *pcm,
int frame_size,
unsigned char *compressed,
int maxCompressedBytes
) OPUS_ARG_NONNULL(1) OPUS_ARG_NONNULL(2) OPUS_ARG_NONNULL(4);
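/** A hedged example of the encode call above (variable names hypothetical,
 * buffer size chosen only for illustration): the returned byte count must be
 * conveyed to the decoder by the transport, since Opus Custom packets are not
 * self-delimiting.
 * @code
 * unsigned char packet[1500];
 * int nbBytes = opus_custom_encode(enc, pcm, 256, packet, sizeof(packet));
 * if (nbBytes < 0) { ... nbBytes is an error code ... }
 * ... transmit both packet and nbBytes to the receiver ...
 * @endcode
 */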
/** Perform a CTL function on an Opus custom encoder.
*
* Generally the request and subsequent arguments are generated
* by a convenience macro.
* @see opus_encoderctls
*/
OPUS_CUSTOM_EXPORT int opus_custom_encoder_ctl(OpusCustomEncoder * OPUS_RESTRICT st, int request, ...) OPUS_ARG_NONNULL(1);
#if !defined(OPUS_BUILD) || defined(CELT_DECODER_C)
/* Decoder */
/** Gets the size of an OpusCustomDecoder structure.
* @param [in] mode <tt>OpusCustomMode *</tt>: Mode configuration
* @param [in] channels <tt>int</tt>: Number of channels
* @returns size
*/
OPUS_CUSTOM_EXPORT_STATIC OPUS_WARN_UNUSED_RESULT int opus_custom_decoder_get_size(
const OpusCustomMode *mode,
int channels
) OPUS_ARG_NONNULL(1);
/** Initializes a previously allocated decoder state
* The memory pointed to by st must be the size returned by opus_custom_decoder_get_size.
* This is intended for applications which use their own allocator instead of malloc.
* @see opus_custom_decoder_create(),opus_custom_decoder_get_size()
* To reset a previously initialized state use the OPUS_RESET_STATE CTL.
* @param [in] st <tt>OpusCustomDecoder*</tt>: Decoder state
* @param [in] mode <tt>OpusCustomMode *</tt>: Contains all the information about the characteristics of
* the stream (must be the same characteristics as used for the
* encoder)
* @param [in] channels <tt>int</tt>: Number of channels
* @return OPUS_OK Success or @ref opus_errorcodes
*/
OPUS_CUSTOM_EXPORT_STATIC int opus_custom_decoder_init(
OpusCustomDecoder *st,
const OpusCustomMode *mode,
int channels
) OPUS_ARG_NONNULL(1) OPUS_ARG_NONNULL(2);
#endif
/** Creates a new decoder state. Each stream needs its own decoder state (can't
* be shared across simultaneous streams).
* @param [in] mode <tt>OpusCustomMode</tt>: Contains all the information about the characteristics of the
* stream (must be the same characteristics as used for the encoder)
* @param [in] channels <tt>int</tt>: Number of channels
* @param [out] error <tt>int*</tt>: Returns an error code
* @return Newly created decoder state.
*/
OPUS_CUSTOM_EXPORT OPUS_WARN_UNUSED_RESULT OpusCustomDecoder *opus_custom_decoder_create(
const OpusCustomMode *mode,
int channels,
int *error
) OPUS_ARG_NONNULL(1);
/** Destroys a decoder state.
* @param[in] st <tt>OpusCustomDecoder*</tt>: State to be freed.
*/
OPUS_CUSTOM_EXPORT void opus_custom_decoder_destroy(OpusCustomDecoder *st);
/** Decode an opus custom frame with floating point output
* @param [in] st <tt>OpusCustomDecoder*</tt>: Decoder state
* @param [in] data <tt>char*</tt>: Input payload. Use a NULL pointer to indicate packet loss
* @param [in] len <tt>int</tt>: Number of bytes in payload
* @param [out] pcm <tt>float*</tt>: Output signal (interleaved if 2 channels). length
* is frame_size*channels*sizeof(float)
* @param [in] frame_size Number of samples per channel of available space in *pcm.
* @returns Number of decoded samples or @ref opus_errorcodes
*/
OPUS_CUSTOM_EXPORT OPUS_WARN_UNUSED_RESULT int opus_custom_decode_float(
OpusCustomDecoder *st,
const unsigned char *data,
int len,
float *pcm,
int frame_size
) OPUS_ARG_NONNULL(1) OPUS_ARG_NONNULL(4);
/** Decode an opus custom frame
* @param [in] st <tt>OpusCustomDecoder*</tt>: Decoder state
* @param [in] data <tt>char*</tt>: Input payload. Use a NULL pointer to indicate packet loss
* @param [in] len <tt>int</tt>: Number of bytes in payload
* @param [out] pcm <tt>opus_int16*</tt>: Output signal (interleaved if 2 channels). length
* is frame_size*channels*sizeof(opus_int16)
* @param [in] frame_size Number of samples per channel of available space in *pcm.
* @returns Number of decoded samples or @ref opus_errorcodes
*/
OPUS_CUSTOM_EXPORT OPUS_WARN_UNUSED_RESULT int opus_custom_decode(
OpusCustomDecoder *st,
const unsigned char *data,
int len,
opus_int16 *pcm,
int frame_size
) OPUS_ARG_NONNULL(1) OPUS_ARG_NONNULL(4);
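/** A hedged decoding sketch (names hypothetical; packet_was_received stands in
 * for whatever the transport reports): pass the received payload and its
 * length, or a NULL payload to conceal a lost packet.
 * @code
 * opus_int16 out[2*256];
 * int nbSamples;
 * if (packet_was_received)
 *    nbSamples = opus_custom_decode(dec, packet, nbBytes, out, 256);
 * else
 *    nbSamples = opus_custom_decode(dec, NULL, 0, out, 256);
 * if (nbSamples < 0) { ... nbSamples is an error code ... }
 * @endcode
 */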
/** Perform a CTL function on an Opus custom decoder.
*
* Generally the request and subsequent arguments are generated
* by a convenience macro.
* @see opus_genericctls
*/
OPUS_CUSTOM_EXPORT int opus_custom_decoder_ctl(OpusCustomDecoder * OPUS_RESTRICT st, int request, ...) OPUS_ARG_NONNULL(1);
/**@}*/
#ifdef __cplusplus
}
#endif
#endif /* OPUS_CUSTOM_H */

View File

@@ -1,726 +0,0 @@
/* Copyright (c) 2010-2011 Xiph.Org Foundation, Skype Limited
Written by Jean-Marc Valin and Koen Vos */
/*
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions
are met:
- Redistributions of source code must retain the above copyright
notice, this list of conditions and the following disclaimer.
- Redistributions in binary form must reproduce the above copyright
notice, this list of conditions and the following disclaimer in the
documentation and/or other materials provided with the distribution.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER
OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF
LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
/**
* @file opus_defines.h
* @brief Opus reference implementation constants
*/
#ifndef OPUS_DEFINES_H
#define OPUS_DEFINES_H
#include "opus_types.h"
#ifdef __cplusplus
extern "C" {
#endif
/** @defgroup opus_errorcodes Error codes
* @{
*/
/** No error @hideinitializer*/
#define OPUS_OK 0
/** One or more invalid/out of range arguments @hideinitializer*/
#define OPUS_BAD_ARG -1
/** Not enough bytes allocated in the buffer @hideinitializer*/
#define OPUS_BUFFER_TOO_SMALL -2
/** An internal error was detected @hideinitializer*/
#define OPUS_INTERNAL_ERROR -3
/** The compressed data passed is corrupted @hideinitializer*/
#define OPUS_INVALID_PACKET -4
/** Invalid/unsupported request number @hideinitializer*/
#define OPUS_UNIMPLEMENTED -5
/** An encoder or decoder structure is invalid or already freed @hideinitializer*/
#define OPUS_INVALID_STATE -6
/** Memory allocation has failed @hideinitializer*/
#define OPUS_ALLOC_FAIL -7
/**@}*/
/** @cond OPUS_INTERNAL_DOC */
/**Export control for opus functions */
#ifndef OPUS_EXPORT
# if defined(WIN32)
# ifdef OPUS_BUILD
# define OPUS_EXPORT __declspec(dllexport)
# else
# define OPUS_EXPORT
# endif
# elif defined(__GNUC__) && defined(OPUS_BUILD)
# define OPUS_EXPORT __attribute__ ((visibility ("default")))
# else
# define OPUS_EXPORT
# endif
#endif
# if !defined(OPUS_GNUC_PREREQ)
# if defined(__GNUC__)&&defined(__GNUC_MINOR__)
# define OPUS_GNUC_PREREQ(_maj,_min) \
((__GNUC__<<16)+__GNUC_MINOR__>=((_maj)<<16)+(_min))
# else
# define OPUS_GNUC_PREREQ(_maj,_min) 0
# endif
# endif
#if (!defined(__STDC_VERSION__) || (__STDC_VERSION__ < 199901L) )
# if OPUS_GNUC_PREREQ(3,0)
# define OPUS_RESTRICT __restrict__
# elif (defined(_MSC_VER) && _MSC_VER >= 1400)
# define OPUS_RESTRICT __restrict
# else
# define OPUS_RESTRICT
# endif
#else
# define OPUS_RESTRICT restrict
#endif
#if (!defined(__STDC_VERSION__) || (__STDC_VERSION__ < 199901L) )
# if OPUS_GNUC_PREREQ(2,7)
# define OPUS_INLINE __inline__
# elif (defined(_MSC_VER))
# define OPUS_INLINE __inline
# else
# define OPUS_INLINE
# endif
#else
# define OPUS_INLINE inline
#endif
/**Warning attributes for opus functions
* NONNULL is not used in OPUS_BUILD to avoid the compiler optimizing out
* some paranoid null checks. */
#if defined(__GNUC__) && OPUS_GNUC_PREREQ(3, 4)
# define OPUS_WARN_UNUSED_RESULT __attribute__ ((__warn_unused_result__))
#else
# define OPUS_WARN_UNUSED_RESULT
#endif
#if !defined(OPUS_BUILD) && defined(__GNUC__) && OPUS_GNUC_PREREQ(3, 4)
# define OPUS_ARG_NONNULL(_x) __attribute__ ((__nonnull__(_x)))
#else
# define OPUS_ARG_NONNULL(_x)
#endif
/** These are the actual Encoder CTL ID numbers.
* They should not be used directly by applications.
* In general, SETs should be even and GETs should be odd.*/
#define OPUS_SET_APPLICATION_REQUEST 4000
#define OPUS_GET_APPLICATION_REQUEST 4001
#define OPUS_SET_BITRATE_REQUEST 4002
#define OPUS_GET_BITRATE_REQUEST 4003
#define OPUS_SET_MAX_BANDWIDTH_REQUEST 4004
#define OPUS_GET_MAX_BANDWIDTH_REQUEST 4005
#define OPUS_SET_VBR_REQUEST 4006
#define OPUS_GET_VBR_REQUEST 4007
#define OPUS_SET_BANDWIDTH_REQUEST 4008
#define OPUS_GET_BANDWIDTH_REQUEST 4009
#define OPUS_SET_COMPLEXITY_REQUEST 4010
#define OPUS_GET_COMPLEXITY_REQUEST 4011
#define OPUS_SET_INBAND_FEC_REQUEST 4012
#define OPUS_GET_INBAND_FEC_REQUEST 4013
#define OPUS_SET_PACKET_LOSS_PERC_REQUEST 4014
#define OPUS_GET_PACKET_LOSS_PERC_REQUEST 4015
#define OPUS_SET_DTX_REQUEST 4016
#define OPUS_GET_DTX_REQUEST 4017
#define OPUS_SET_VBR_CONSTRAINT_REQUEST 4020
#define OPUS_GET_VBR_CONSTRAINT_REQUEST 4021
#define OPUS_SET_FORCE_CHANNELS_REQUEST 4022
#define OPUS_GET_FORCE_CHANNELS_REQUEST 4023
#define OPUS_SET_SIGNAL_REQUEST 4024
#define OPUS_GET_SIGNAL_REQUEST 4025
#define OPUS_GET_LOOKAHEAD_REQUEST 4027
/* #define OPUS_RESET_STATE 4028 */
#define OPUS_GET_SAMPLE_RATE_REQUEST 4029
#define OPUS_GET_FINAL_RANGE_REQUEST 4031
#define OPUS_GET_PITCH_REQUEST 4033
#define OPUS_SET_GAIN_REQUEST 4034
#define OPUS_GET_GAIN_REQUEST 4045 /* Should have been 4035 */
#define OPUS_SET_LSB_DEPTH_REQUEST 4036
#define OPUS_GET_LSB_DEPTH_REQUEST 4037
#define OPUS_GET_LAST_PACKET_DURATION_REQUEST 4039
#define OPUS_SET_EXPERT_FRAME_DURATION_REQUEST 4040
#define OPUS_GET_EXPERT_FRAME_DURATION_REQUEST 4041
#define OPUS_SET_PREDICTION_DISABLED_REQUEST 4042
#define OPUS_GET_PREDICTION_DISABLED_REQUEST 4043
/* Don't use 4045, it's already taken by OPUS_GET_GAIN_REQUEST */
/* Macros to trigger compilation errors when the wrong types are provided to a CTL */
#define __opus_check_int(x) (((void)((x) == (opus_int32)0)), (opus_int32)(x))
#define __opus_check_int_ptr(ptr) ((ptr) + ((ptr) - (opus_int32*)(ptr)))
#define __opus_check_uint_ptr(ptr) ((ptr) + ((ptr) - (opus_uint32*)(ptr)))
#define __opus_check_val16_ptr(ptr) ((ptr) + ((ptr) - (opus_val16*)(ptr)))
/** @endcond */
/** @defgroup opus_ctlvalues Pre-defined values for CTL interface
* @see opus_genericctls, opus_encoderctls
* @{
*/
/* Values for the various encoder CTLs */
#define OPUS_AUTO -1000 /**<Auto/default setting @hideinitializer*/
#define OPUS_BITRATE_MAX -1 /**<Maximum bitrate @hideinitializer*/
/** Best for most VoIP/videoconference applications where listening quality and intelligibility matter most
* @hideinitializer */
#define OPUS_APPLICATION_VOIP 2048
/** Best for broadcast/high-fidelity application where the decoded audio should be as close as possible to the input
* @hideinitializer */
#define OPUS_APPLICATION_AUDIO 2049
/** Only use when lowest-achievable latency is what matters most. Voice-optimized modes cannot be used.
* @hideinitializer */
#define OPUS_APPLICATION_RESTRICTED_LOWDELAY 2051
#define OPUS_SIGNAL_VOICE 3001 /**< Signal being encoded is voice */
#define OPUS_SIGNAL_MUSIC 3002 /**< Signal being encoded is music */
#define OPUS_BANDWIDTH_NARROWBAND 1101 /**< 4 kHz bandpass @hideinitializer*/
#define OPUS_BANDWIDTH_MEDIUMBAND 1102 /**< 6 kHz bandpass @hideinitializer*/
#define OPUS_BANDWIDTH_WIDEBAND 1103 /**< 8 kHz bandpass @hideinitializer*/
#define OPUS_BANDWIDTH_SUPERWIDEBAND 1104 /**<12 kHz bandpass @hideinitializer*/
#define OPUS_BANDWIDTH_FULLBAND 1105 /**<20 kHz bandpass @hideinitializer*/
#define OPUS_FRAMESIZE_ARG 5000 /**< Select frame size from the argument (default) */
#define OPUS_FRAMESIZE_2_5_MS 5001 /**< Use 2.5 ms frames */
#define OPUS_FRAMESIZE_5_MS 5002 /**< Use 5 ms frames */
#define OPUS_FRAMESIZE_10_MS 5003 /**< Use 10 ms frames */
#define OPUS_FRAMESIZE_20_MS 5004 /**< Use 20 ms frames */
#define OPUS_FRAMESIZE_40_MS 5005 /**< Use 40 ms frames */
#define OPUS_FRAMESIZE_60_MS 5006 /**< Use 60 ms frames */
/**@}*/
/** @defgroup opus_encoderctls Encoder related CTLs
*
* These are convenience macros for use with the \c opus_encode_ctl
* interface. They are used to generate the appropriate series of
* arguments for that call, passing the correct type, size and so
* on as expected for each particular request.
*
* Some usage examples:
*
* @code
* int ret;
* ret = opus_encoder_ctl(enc_ctx, OPUS_SET_BANDWIDTH(OPUS_AUTO));
* if (ret != OPUS_OK) return ret;
*
* opus_int32 rate;
* opus_encoder_ctl(enc_ctx, OPUS_GET_BANDWIDTH(&rate));
*
* opus_encoder_ctl(enc_ctx, OPUS_RESET_STATE);
* @endcode
*
* @see opus_genericctls, opus_encoder
* @{
*/
/** Configures the encoder's computational complexity.
* The supported range is 0-10 inclusive with 10 representing the highest complexity.
* @see OPUS_GET_COMPLEXITY
* @param[in] x <tt>opus_int32</tt>: Allowed values: 0-10, inclusive.
*
* @hideinitializer */
#define OPUS_SET_COMPLEXITY(x) OPUS_SET_COMPLEXITY_REQUEST, __opus_check_int(x)
/** Gets the encoder's complexity configuration.
* @see OPUS_SET_COMPLEXITY
* @param[out] x <tt>opus_int32 *</tt>: Returns a value in the range 0-10,
* inclusive.
* @hideinitializer */
#define OPUS_GET_COMPLEXITY(x) OPUS_GET_COMPLEXITY_REQUEST, __opus_check_int_ptr(x)
/** Configures the bitrate in the encoder.
* Rates from 500 to 512000 bits per second are meaningful, as well as the
* special values #OPUS_AUTO and #OPUS_BITRATE_MAX.
* The value #OPUS_BITRATE_MAX can be used to cause the codec to use as much
* rate as it can, which is useful for controlling the rate by adjusting the
* output buffer size.
* @see OPUS_GET_BITRATE
* @param[in] x <tt>opus_int32</tt>: Bitrate in bits per second. The default
* is determined based on the number of
* channels and the input sampling rate.
* @hideinitializer */
#define OPUS_SET_BITRATE(x) OPUS_SET_BITRATE_REQUEST, __opus_check_int(x)
/** Gets the encoder's bitrate configuration.
* @see OPUS_SET_BITRATE
* @param[out] x <tt>opus_int32 *</tt>: Returns the bitrate in bits per second.
* The default is determined based on the
* number of channels and the input
* sampling rate.
* @hideinitializer */
#define OPUS_GET_BITRATE(x) OPUS_GET_BITRATE_REQUEST, __opus_check_int_ptr(x)
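/** A short illustrative sketch (enc_ctx is a hypothetical encoder instance):
 * request 64 kb/s and read back the value the encoder actually stored.
 * @code
 * opus_int32 bitrate;
 * opus_encoder_ctl(enc_ctx, OPUS_SET_BITRATE(64000));
 * opus_encoder_ctl(enc_ctx, OPUS_GET_BITRATE(&bitrate));
 * @endcode
 */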
/** Enables or disables variable bitrate (VBR) in the encoder.
* The configured bitrate may not be met exactly because frames must
* be an integer number of bytes in length.
* @warning Only the MDCT mode of Opus can provide hard CBR behavior.
* @see OPUS_GET_VBR
* @see OPUS_SET_VBR_CONSTRAINT
* @param[in] x <tt>opus_int32</tt>: Allowed values:
* <dl>
* <dt>0</dt><dd>Hard CBR. For LPC/hybrid modes at very low bit-rate, this can
* cause noticeable quality degradation.</dd>
* <dt>1</dt><dd>VBR (default). The exact type of VBR is controlled by
* #OPUS_SET_VBR_CONSTRAINT.</dd>
* </dl>
* @hideinitializer */
#define OPUS_SET_VBR(x) OPUS_SET_VBR_REQUEST, __opus_check_int(x)
/** Determine if variable bitrate (VBR) is enabled in the encoder.
* @see OPUS_SET_VBR
* @see OPUS_GET_VBR_CONSTRAINT
* @param[out] x <tt>opus_int32 *</tt>: Returns one of the following values:
* <dl>
* <dt>0</dt><dd>Hard CBR.</dd>
* <dt>1</dt><dd>VBR (default). The exact type of VBR may be retrieved via
* #OPUS_GET_VBR_CONSTRAINT.</dd>
* </dl>
* @hideinitializer */
#define OPUS_GET_VBR(x) OPUS_GET_VBR_REQUEST, __opus_check_int_ptr(x)
/** Enables or disables constrained VBR in the encoder.
* This setting is ignored when the encoder is in CBR mode.
* @warning Only the MDCT mode of Opus currently heeds the constraint.
 * Speech mode ignores it completely; hybrid mode may fail to obey it
* if the LPC layer uses more bitrate than the constraint would have
* permitted.
* @see OPUS_GET_VBR_CONSTRAINT
* @see OPUS_SET_VBR
* @param[in] x <tt>opus_int32</tt>: Allowed values:
* <dl>
* <dt>0</dt><dd>Unconstrained VBR.</dd>
* <dt>1</dt><dd>Constrained VBR (default). This creates a maximum of one
* frame of buffering delay assuming a transport with a
* serialization speed of the nominal bitrate.</dd>
* </dl>
* @hideinitializer */
#define OPUS_SET_VBR_CONSTRAINT(x) OPUS_SET_VBR_CONSTRAINT_REQUEST, __opus_check_int(x)
/** Determine if constrained VBR is enabled in the encoder.
* @see OPUS_SET_VBR_CONSTRAINT
* @see OPUS_GET_VBR
* @param[out] x <tt>opus_int32 *</tt>: Returns one of the following values:
* <dl>
* <dt>0</dt><dd>Unconstrained VBR.</dd>
* <dt>1</dt><dd>Constrained VBR (default).</dd>
* </dl>
* @hideinitializer */
#define OPUS_GET_VBR_CONSTRAINT(x) OPUS_GET_VBR_CONSTRAINT_REQUEST, __opus_check_int_ptr(x)
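/** A hedged configuration sketch (enc_ctx hypothetical): the first pair
 * selects hard CBR at 32 kb/s, the second pair makes the default constrained
 * VBR behavior explicit.
 * @code
 * opus_encoder_ctl(enc_ctx, OPUS_SET_VBR(0));
 * opus_encoder_ctl(enc_ctx, OPUS_SET_BITRATE(32000));
 *
 * opus_encoder_ctl(enc_ctx, OPUS_SET_VBR(1));
 * opus_encoder_ctl(enc_ctx, OPUS_SET_VBR_CONSTRAINT(1));
 * @endcode
 */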
/** Configures mono/stereo forcing in the encoder.
* This can force the encoder to produce packets encoded as either mono or
* stereo, regardless of the format of the input audio. This is useful when
* the caller knows that the input signal is currently a mono source embedded
* in a stereo stream.
* @see OPUS_GET_FORCE_CHANNELS
* @param[in] x <tt>opus_int32</tt>: Allowed values:
* <dl>
* <dt>#OPUS_AUTO</dt><dd>Not forced (default)</dd>
* <dt>1</dt> <dd>Forced mono</dd>
* <dt>2</dt> <dd>Forced stereo</dd>
* </dl>
* @hideinitializer */
#define OPUS_SET_FORCE_CHANNELS(x) OPUS_SET_FORCE_CHANNELS_REQUEST, __opus_check_int(x)
/** Gets the encoder's forced channel configuration.
* @see OPUS_SET_FORCE_CHANNELS
* @param[out] x <tt>opus_int32 *</tt>:
* <dl>
* <dt>#OPUS_AUTO</dt><dd>Not forced (default)</dd>
* <dt>1</dt> <dd>Forced mono</dd>
* <dt>2</dt> <dd>Forced stereo</dd>
* </dl>
* @hideinitializer */
#define OPUS_GET_FORCE_CHANNELS(x) OPUS_GET_FORCE_CHANNELS_REQUEST, __opus_check_int_ptr(x)
/** Configures the maximum bandpass that the encoder will select automatically.
* Applications should normally use this instead of #OPUS_SET_BANDWIDTH
* (leaving that set to the default, #OPUS_AUTO). This allows the
* application to set an upper bound based on the type of input it is
* providing, but still gives the encoder the freedom to reduce the bandpass
* when the bitrate becomes too low, for better overall quality.
* @see OPUS_GET_MAX_BANDWIDTH
* @param[in] x <tt>opus_int32</tt>: Allowed values:
* <dl>
* <dt>OPUS_BANDWIDTH_NARROWBAND</dt> <dd>4 kHz passband</dd>
* <dt>OPUS_BANDWIDTH_MEDIUMBAND</dt> <dd>6 kHz passband</dd>
* <dt>OPUS_BANDWIDTH_WIDEBAND</dt> <dd>8 kHz passband</dd>
* <dt>OPUS_BANDWIDTH_SUPERWIDEBAND</dt><dd>12 kHz passband</dd>
* <dt>OPUS_BANDWIDTH_FULLBAND</dt> <dd>20 kHz passband (default)</dd>
* </dl>
* @hideinitializer */
#define OPUS_SET_MAX_BANDWIDTH(x) OPUS_SET_MAX_BANDWIDTH_REQUEST, __opus_check_int(x)
/** Gets the encoder's configured maximum allowed bandpass.
* @see OPUS_SET_MAX_BANDWIDTH
* @param[out] x <tt>opus_int32 *</tt>: Allowed values:
* <dl>
* <dt>#OPUS_BANDWIDTH_NARROWBAND</dt> <dd>4 kHz passband</dd>
* <dt>#OPUS_BANDWIDTH_MEDIUMBAND</dt> <dd>6 kHz passband</dd>
* <dt>#OPUS_BANDWIDTH_WIDEBAND</dt> <dd>8 kHz passband</dd>
* <dt>#OPUS_BANDWIDTH_SUPERWIDEBAND</dt><dd>12 kHz passband</dd>
* <dt>#OPUS_BANDWIDTH_FULLBAND</dt> <dd>20 kHz passband (default)</dd>
* </dl>
* @hideinitializer */
#define OPUS_GET_MAX_BANDWIDTH(x) OPUS_GET_MAX_BANDWIDTH_REQUEST, __opus_check_int_ptr(x)
/** Sets the encoder's bandpass to a specific value.
* This prevents the encoder from automatically selecting the bandpass based
* on the available bitrate. If an application knows the bandpass of the input
* audio it is providing, it should normally use #OPUS_SET_MAX_BANDWIDTH
* instead, which still gives the encoder the freedom to reduce the bandpass
* when the bitrate becomes too low, for better overall quality.
* @see OPUS_GET_BANDWIDTH
* @param[in] x <tt>opus_int32</tt>: Allowed values:
* <dl>
* <dt>#OPUS_AUTO</dt> <dd>(default)</dd>
* <dt>#OPUS_BANDWIDTH_NARROWBAND</dt> <dd>4 kHz passband</dd>
* <dt>#OPUS_BANDWIDTH_MEDIUMBAND</dt> <dd>6 kHz passband</dd>
* <dt>#OPUS_BANDWIDTH_WIDEBAND</dt> <dd>8 kHz passband</dd>
* <dt>#OPUS_BANDWIDTH_SUPERWIDEBAND</dt><dd>12 kHz passband</dd>
* <dt>#OPUS_BANDWIDTH_FULLBAND</dt> <dd>20 kHz passband</dd>
* </dl>
* @hideinitializer */
#define OPUS_SET_BANDWIDTH(x) OPUS_SET_BANDWIDTH_REQUEST, __opus_check_int(x)
/** Configures the type of signal being encoded.
* This is a hint which helps the encoder's mode selection.
* @see OPUS_GET_SIGNAL
* @param[in] x <tt>opus_int32</tt>: Allowed values:
* <dl>
* <dt>#OPUS_AUTO</dt> <dd>(default)</dd>
* <dt>#OPUS_SIGNAL_VOICE</dt><dd>Bias thresholds towards choosing LPC or Hybrid modes.</dd>
* <dt>#OPUS_SIGNAL_MUSIC</dt><dd>Bias thresholds towards choosing MDCT modes.</dd>
* </dl>
* @hideinitializer */
#define OPUS_SET_SIGNAL(x) OPUS_SET_SIGNAL_REQUEST, __opus_check_int(x)
/** Gets the encoder's configured signal type.
* @see OPUS_SET_SIGNAL
* @param[out] x <tt>opus_int32 *</tt>: Returns one of the following values:
* <dl>
* <dt>#OPUS_AUTO</dt> <dd>(default)</dd>
* <dt>#OPUS_SIGNAL_VOICE</dt><dd>Bias thresholds towards choosing LPC or Hybrid modes.</dd>
* <dt>#OPUS_SIGNAL_MUSIC</dt><dd>Bias thresholds towards choosing MDCT modes.</dd>
* </dl>
* @hideinitializer */
#define OPUS_GET_SIGNAL(x) OPUS_GET_SIGNAL_REQUEST, __opus_check_int_ptr(x)
/** Configures the encoder's intended application.
* The initial value is a mandatory argument to the encoder_create function.
* @see OPUS_GET_APPLICATION
 * @param[in] x <tt>opus_int32</tt>: One of the following values:
* <dl>
* <dt>#OPUS_APPLICATION_VOIP</dt>
* <dd>Process signal for improved speech intelligibility.</dd>
* <dt>#OPUS_APPLICATION_AUDIO</dt>
* <dd>Favor faithfulness to the original input.</dd>
* <dt>#OPUS_APPLICATION_RESTRICTED_LOWDELAY</dt>
* <dd>Configure the minimum possible coding delay by disabling certain modes
* of operation.</dd>
* </dl>
* @hideinitializer */
#define OPUS_SET_APPLICATION(x) OPUS_SET_APPLICATION_REQUEST, __opus_check_int(x)
/** Gets the encoder's configured application.
* @see OPUS_SET_APPLICATION
* @param[out] x <tt>opus_int32 *</tt>: Returns one of the following values:
* <dl>
* <dt>#OPUS_APPLICATION_VOIP</dt>
* <dd>Process signal for improved speech intelligibility.</dd>
* <dt>#OPUS_APPLICATION_AUDIO</dt>
* <dd>Favor faithfulness to the original input.</dd>
* <dt>#OPUS_APPLICATION_RESTRICTED_LOWDELAY</dt>
* <dd>Configure the minimum possible coding delay by disabling certain modes
* of operation.</dd>
* </dl>
* @hideinitializer */
#define OPUS_GET_APPLICATION(x) OPUS_GET_APPLICATION_REQUEST, __opus_check_int_ptr(x)
/** Gets the total samples of delay added by the entire codec.
 * This can be queried from the encoder, and then that number of samples can be
 * skipped from the start of the decoder's output to provide time-aligned input
 * and output. From the perspective of a decoding application the real data begins this many
* samples late.
*
* The decoder contribution to this delay is identical for all decoders, but the
* encoder portion of the delay may vary from implementation to implementation,
* version to version, or even depend on the encoder's initial configuration.
* Applications needing delay compensation should call this CTL rather than
* hard-coding a value.
* @param[out] x <tt>opus_int32 *</tt>: Number of lookahead samples
* @hideinitializer */
#define OPUS_GET_LOOKAHEAD(x) OPUS_GET_LOOKAHEAD_REQUEST, __opus_check_int_ptr(x)
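/** A hedged delay-compensation sketch (enc_ctx hypothetical): query the
 * lookahead once after configuring the encoder, then drop that many samples
 * per channel from the start of the decoded output.
 * @code
 * opus_int32 skip;
 * opus_encoder_ctl(enc_ctx, OPUS_GET_LOOKAHEAD(&skip));
 * // discard the first 'skip' decoded samples per channel to align input and output
 * @endcode
 */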
/** Configures the encoder's use of inband forward error correction (FEC).
* @note This is only applicable to the LPC layer
* @see OPUS_GET_INBAND_FEC
* @param[in] x <tt>opus_int32</tt>: Allowed values:
* <dl>
* <dt>0</dt><dd>Disable inband FEC (default).</dd>
* <dt>1</dt><dd>Enable inband FEC.</dd>
* </dl>
* @hideinitializer */
#define OPUS_SET_INBAND_FEC(x) OPUS_SET_INBAND_FEC_REQUEST, __opus_check_int(x)
/** Gets encoder's configured use of inband forward error correction.
* @see OPUS_SET_INBAND_FEC
* @param[out] x <tt>opus_int32 *</tt>: Returns one of the following values:
* <dl>
* <dt>0</dt><dd>Inband FEC disabled (default).</dd>
* <dt>1</dt><dd>Inband FEC enabled.</dd>
* </dl>
* @hideinitializer */
#define OPUS_GET_INBAND_FEC(x) OPUS_GET_INBAND_FEC_REQUEST, __opus_check_int_ptr(x)
/** Configures the encoder's expected packet loss percentage.
 * Higher values will trigger progressively more loss-resistant behavior in the encoder
 * at the expense of quality at a given bitrate in the loss-free case, but greater quality
 * under loss.
* @see OPUS_GET_PACKET_LOSS_PERC
* @param[in] x <tt>opus_int32</tt>: Loss percentage in the range 0-100, inclusive (default: 0).
* @hideinitializer */
#define OPUS_SET_PACKET_LOSS_PERC(x) OPUS_SET_PACKET_LOSS_PERC_REQUEST, __opus_check_int(x)
/** Gets the encoder's configured packet loss percentage.
* @see OPUS_SET_PACKET_LOSS_PERC
* @param[out] x <tt>opus_int32 *</tt>: Returns the configured loss percentage
* in the range 0-100, inclusive (default: 0).
* @hideinitializer */
#define OPUS_GET_PACKET_LOSS_PERC(x) OPUS_GET_PACKET_LOSS_PERC_REQUEST, __opus_check_int_ptr(x)
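/** A hedged loss-robustness sketch (enc_ctx hypothetical): enable inband FEC
 * and tell the encoder to expect roughly 20% packet loss. FEC applies only to
 * the LPC layer, and the encoder decides when to actually add redundancy based
 * on these settings.
 * @code
 * opus_encoder_ctl(enc_ctx, OPUS_SET_INBAND_FEC(1));
 * opus_encoder_ctl(enc_ctx, OPUS_SET_PACKET_LOSS_PERC(20));
 * @endcode
 */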
/** Configures the encoder's use of discontinuous transmission (DTX).
* @note This is only applicable to the LPC layer
* @see OPUS_GET_DTX
* @param[in] x <tt>opus_int32</tt>: Allowed values:
* <dl>
* <dt>0</dt><dd>Disable DTX (default).</dd>
 * <dt>1</dt><dd>Enable DTX.</dd>
* </dl>
* @hideinitializer */
#define OPUS_SET_DTX(x) OPUS_SET_DTX_REQUEST, __opus_check_int(x)
/** Gets encoder's configured use of discontinuous transmission.
* @see OPUS_SET_DTX
* @param[out] x <tt>opus_int32 *</tt>: Returns one of the following values:
* <dl>
* <dt>0</dt><dd>DTX disabled (default).</dd>
* <dt>1</dt><dd>DTX enabled.</dd>
* </dl>
* @hideinitializer */
#define OPUS_GET_DTX(x) OPUS_GET_DTX_REQUEST, __opus_check_int_ptr(x)
/** Configures the depth of signal being encoded.
* This is a hint which helps the encoder identify silence and near-silence.
* @see OPUS_GET_LSB_DEPTH
* @param[in] x <tt>opus_int32</tt>: Input precision in bits, between 8 and 24
* (default: 24).
* @hideinitializer */
#define OPUS_SET_LSB_DEPTH(x) OPUS_SET_LSB_DEPTH_REQUEST, __opus_check_int(x)
/** Gets the encoder's configured signal depth.
* @see OPUS_SET_LSB_DEPTH
* @param[out] x <tt>opus_int32 *</tt>: Input precision in bits, between 8 and
* 24 (default: 24).
* @hideinitializer */
#define OPUS_GET_LSB_DEPTH(x) OPUS_GET_LSB_DEPTH_REQUEST, __opus_check_int_ptr(x)
/** Configures the encoder's use of variable duration frames.
* When variable duration is enabled, the encoder is free to use a shorter frame
* size than the one requested in the opus_encode*() call.
* It is then the user's responsibility
* to verify how much audio was encoded by checking the ToC byte of the encoded
* packet. The part of the audio that was not encoded needs to be resent to the
* encoder for the next call. Do not use this option unless you <b>really</b>
* know what you are doing.
 * @see OPUS_GET_EXPERT_FRAME_DURATION
* @param[in] x <tt>opus_int32</tt>: Allowed values:
* <dl>
* <dt>OPUS_FRAMESIZE_ARG</dt><dd>Select frame size from the argument (default).</dd>
* <dt>OPUS_FRAMESIZE_2_5_MS</dt><dd>Use 2.5 ms frames.</dd>
 * <dt>OPUS_FRAMESIZE_5_MS</dt><dd>Use 5 ms frames.</dd>
* <dt>OPUS_FRAMESIZE_10_MS</dt><dd>Use 10 ms frames.</dd>
* <dt>OPUS_FRAMESIZE_20_MS</dt><dd>Use 20 ms frames.</dd>
* <dt>OPUS_FRAMESIZE_40_MS</dt><dd>Use 40 ms frames.</dd>
* <dt>OPUS_FRAMESIZE_60_MS</dt><dd>Use 60 ms frames.</dd>
* <dt>OPUS_FRAMESIZE_VARIABLE</dt><dd>Optimize the frame size dynamically.</dd>
* </dl>
* @hideinitializer */
#define OPUS_SET_EXPERT_FRAME_DURATION(x) OPUS_SET_EXPERT_FRAME_DURATION_REQUEST, __opus_check_int(x)
/** Gets the encoder's configured use of variable duration frames.
 * @see OPUS_SET_EXPERT_FRAME_DURATION
* @param[out] x <tt>opus_int32 *</tt>: Returns one of the following values:
* <dl>
* <dt>OPUS_FRAMESIZE_ARG</dt><dd>Select frame size from the argument (default).</dd>
* <dt>OPUS_FRAMESIZE_2_5_MS</dt><dd>Use 2.5 ms frames.</dd>
 * <dt>OPUS_FRAMESIZE_5_MS</dt><dd>Use 5 ms frames.</dd>
* <dt>OPUS_FRAMESIZE_10_MS</dt><dd>Use 10 ms frames.</dd>
* <dt>OPUS_FRAMESIZE_20_MS</dt><dd>Use 20 ms frames.</dd>
* <dt>OPUS_FRAMESIZE_40_MS</dt><dd>Use 40 ms frames.</dd>
* <dt>OPUS_FRAMESIZE_60_MS</dt><dd>Use 60 ms frames.</dd>
* <dt>OPUS_FRAMESIZE_VARIABLE</dt><dd>Optimize the frame size dynamically.</dd>
* </dl>
* @hideinitializer */
#define OPUS_GET_EXPERT_FRAME_DURATION(x) OPUS_GET_EXPERT_FRAME_DURATION_REQUEST, __opus_check_int_ptr(x)
/** If set to 1, disables almost all use of prediction, making frames almost
completely independent. This reduces quality. (default : 0)
* @hideinitializer */
#define OPUS_SET_PREDICTION_DISABLED(x) OPUS_SET_PREDICTION_DISABLED_REQUEST, __opus_check_int(x)
/** Gets the encoder's configured prediction status.
* @hideinitializer */
#define OPUS_GET_PREDICTION_DISABLED(x) OPUS_GET_PREDICTION_DISABLED_REQUEST, __opus_check_int_ptr(x)
/**@}*/
/** @defgroup opus_genericctls Generic CTLs
*
* These macros are used with the \c opus_decoder_ctl and
* \c opus_encoder_ctl calls to generate a particular
* request.
*
* When called on an \c OpusDecoder they apply to that
* particular decoder instance. When called on an
* \c OpusEncoder they apply to the corresponding setting
* on that encoder instance, if present.
*
* Some usage examples:
*
* @code
* int ret;
* opus_int32 pitch;
* ret = opus_decoder_ctl(dec_ctx, OPUS_GET_PITCH(&pitch));
 * if (ret != OPUS_OK) return ret;
*
* opus_encoder_ctl(enc_ctx, OPUS_RESET_STATE);
* opus_decoder_ctl(dec_ctx, OPUS_RESET_STATE);
*
* opus_int32 enc_bw, dec_bw;
* opus_encoder_ctl(enc_ctx, OPUS_GET_BANDWIDTH(&enc_bw));
* opus_decoder_ctl(dec_ctx, OPUS_GET_BANDWIDTH(&dec_bw));
* if (enc_bw != dec_bw) {
* printf("packet bandwidth mismatch!\n");
* }
* @endcode
*
* @see opus_encoder, opus_decoder_ctl, opus_encoder_ctl, opus_decoderctls, opus_encoderctls
* @{
*/
/** Resets the codec state to be equivalent to a freshly initialized state.
* This should be called when switching streams in order to prevent
* the back to back decoding from giving different results from
* one at a time decoding.
* @hideinitializer */
#define OPUS_RESET_STATE 4028
/** Gets the final state of the codec's entropy coder.
 * This is used for testing purposes.
 * The encoder and decoder state should be identical after coding a payload
 * (assuming no data corruption or software bugs).
*
* @param[out] x <tt>opus_uint32 *</tt>: Entropy coder state
*
* @hideinitializer */
#define OPUS_GET_FINAL_RANGE(x) OPUS_GET_FINAL_RANGE_REQUEST, __opus_check_uint_ptr(x)
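/** A hedged bit-exactness check (enc_ctx and dec_ctx hypothetical): after
 * encoding a frame and decoding the resulting packet, the two entropy coder
 * states should match.
 * @code
 * opus_uint32 enc_final, dec_final;
 * opus_encoder_ctl(enc_ctx, OPUS_GET_FINAL_RANGE(&enc_final));
 * opus_decoder_ctl(dec_ctx, OPUS_GET_FINAL_RANGE(&dec_final));
 * if (enc_final != dec_final)
 *    printf("final range mismatch: corruption or a codec bug\n");
 * @endcode
 */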
/** Gets the encoder's configured bandpass or the decoder's last bandpass.
* @see OPUS_SET_BANDWIDTH
* @param[out] x <tt>opus_int32 *</tt>: Returns one of the following values:
* <dl>
* <dt>#OPUS_AUTO</dt> <dd>(default)</dd>
* <dt>#OPUS_BANDWIDTH_NARROWBAND</dt> <dd>4 kHz passband</dd>
* <dt>#OPUS_BANDWIDTH_MEDIUMBAND</dt> <dd>6 kHz passband</dd>
* <dt>#OPUS_BANDWIDTH_WIDEBAND</dt> <dd>8 kHz passband</dd>
* <dt>#OPUS_BANDWIDTH_SUPERWIDEBAND</dt><dd>12 kHz passband</dd>
* <dt>#OPUS_BANDWIDTH_FULLBAND</dt> <dd>20 kHz passband</dd>
* </dl>
* @hideinitializer */
#define OPUS_GET_BANDWIDTH(x) OPUS_GET_BANDWIDTH_REQUEST, __opus_check_int_ptr(x)
/** Gets the sampling rate the encoder or decoder was initialized with.
* This simply returns the <code>Fs</code> value passed to opus_encoder_init()
* or opus_decoder_init().
* @param[out] x <tt>opus_int32 *</tt>: Sampling rate of encoder or decoder.
* @hideinitializer
*/
#define OPUS_GET_SAMPLE_RATE(x) OPUS_GET_SAMPLE_RATE_REQUEST, __opus_check_int_ptr(x)
/**@}*/
/** @defgroup opus_decoderctls Decoder related CTLs
* @see opus_genericctls, opus_encoderctls, opus_decoder
* @{
*/
/** Configures decoder gain adjustment.
* Scales the decoded output by a factor specified in Q8 dB units.
 * The allowed range is -32768 to 32767 inclusive; values outside it return
 * OPUS_BAD_ARG. The default is zero, indicating no adjustment.
* This setting survives decoder reset.
*
* gain = pow(10, x/(20.0*256))
*
* @param[in] x <tt>opus_int32</tt>: Amount to scale PCM signal by in Q8 dB units.
* @hideinitializer */
#define OPUS_SET_GAIN(x) OPUS_SET_GAIN_REQUEST, __opus_check_int(x)
/** Gets the decoder's configured gain adjustment. @see OPUS_SET_GAIN
*
* @param[out] x <tt>opus_int32 *</tt>: Amount to scale PCM signal by in Q8 dB units.
* @hideinitializer */
#define OPUS_GET_GAIN(x) OPUS_GET_GAIN_REQUEST, __opus_check_int_ptr(x)
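/** A worked example of the Q8 dB scaling above: to attenuate the decoded
 * output by 6 dB, pass -6 * 256 = -1536, which gives a gain of
 * pow(10, -1536/(20.0*256)) = 10^(-0.3), roughly 0.5 in linear terms
 * (dec_ctx is a hypothetical decoder instance).
 * @code
 * opus_decoder_ctl(dec_ctx, OPUS_SET_GAIN(-6 * 256));
 * @endcode
 */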
/** Gets the duration (in samples) of the last packet successfully decoded or concealed.
* @param[out] x <tt>opus_int32 *</tt>: Number of samples (at current sampling rate).
* @hideinitializer */
#define OPUS_GET_LAST_PACKET_DURATION(x) OPUS_GET_LAST_PACKET_DURATION_REQUEST, __opus_check_int_ptr(x)
/** Gets the pitch of the last decoded frame, if available.
* This can be used for any post-processing algorithm requiring the use of pitch,
* e.g. time stretching/shortening. If the last frame was not voiced, or if the
* pitch was not coded in the frame, then zero is returned.
*
* This CTL is only implemented for decoder instances.
*
* @param[out] x <tt>opus_int32 *</tt>: pitch period at 48 kHz (or 0 if not available)
*
* @hideinitializer */
#define OPUS_GET_PITCH(x) OPUS_GET_PITCH_REQUEST, __opus_check_int_ptr(x)
/**@}*/
/** @defgroup opus_libinfo Opus library information functions
* @{
*/
/** Converts an opus error code into a human readable string.
*
* @param[in] error <tt>int</tt>: Error number
* @returns Error string
*/
OPUS_EXPORT const char *opus_strerror(int error);
/** Gets the libopus version string.
*
* @returns Version string
*/
OPUS_EXPORT const char *opus_get_version_string(void);
/**@}*/
#ifdef __cplusplus
}
#endif
#endif /* OPUS_DEFINES_H */

View File

@@ -1,660 +0,0 @@
/* Copyright (c) 2011 Xiph.Org Foundation
Written by Jean-Marc Valin */
/*
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions
are met:
- Redistributions of source code must retain the above copyright
notice, this list of conditions and the following disclaimer.
- Redistributions in binary form must reproduce the above copyright
notice, this list of conditions and the following disclaimer in the
documentation and/or other materials provided with the distribution.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER
OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF
LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
/**
* @file opus_multistream.h
* @brief Opus reference implementation multistream API
*/
#ifndef OPUS_MULTISTREAM_H
#define OPUS_MULTISTREAM_H
#include "opus.h"
#ifdef __cplusplus
extern "C" {
#endif
/** @cond OPUS_INTERNAL_DOC */
/** Macros to trigger compilation errors when the wrong types are provided to a
* CTL. */
/**@{*/
#define __opus_check_encstate_ptr(ptr) ((ptr) + ((ptr) - (OpusEncoder**)(ptr)))
#define __opus_check_decstate_ptr(ptr) ((ptr) + ((ptr) - (OpusDecoder**)(ptr)))
/**@}*/
/** These are the actual encoder and decoder CTL ID numbers.
* They should not be used directly by applications.
* In general, SETs should be even and GETs should be odd.*/
/**@{*/
#define OPUS_MULTISTREAM_GET_ENCODER_STATE_REQUEST 5120
#define OPUS_MULTISTREAM_GET_DECODER_STATE_REQUEST 5122
/**@}*/
/** @endcond */
/** @defgroup opus_multistream_ctls Multistream specific encoder and decoder CTLs
*
* These are convenience macros that are specific to the
* opus_multistream_encoder_ctl() and opus_multistream_decoder_ctl()
* interface.
* The CTLs from @ref opus_genericctls, @ref opus_encoderctls, and
* @ref opus_decoderctls may be applied to a multistream encoder or decoder as
* well.
 * In addition, you may retrieve the encoder or decoder state for a specific
* stream via #OPUS_MULTISTREAM_GET_ENCODER_STATE or
* #OPUS_MULTISTREAM_GET_DECODER_STATE and apply CTLs to it individually.
*/
/**@{*/
/** Gets the encoder state for an individual stream of a multistream encoder.
* @param[in] x <tt>opus_int32</tt>: The index of the stream whose encoder you
* wish to retrieve.
* This must be non-negative and less than
* the <code>streams</code> parameter used
* to initialize the encoder.
* @param[out] y <tt>OpusEncoder**</tt>: Returns a pointer to the given
* encoder state.
* @retval OPUS_BAD_ARG The index of the requested stream was out of range.
* @hideinitializer
*/
#define OPUS_MULTISTREAM_GET_ENCODER_STATE(x,y) OPUS_MULTISTREAM_GET_ENCODER_STATE_REQUEST, __opus_check_int(x), __opus_check_encstate_ptr(y)
/** Gets the decoder state for an individual stream of a multistream decoder.
* @param[in] x <tt>opus_int32</tt>: The index of the stream whose decoder you
* wish to retrieve.
* This must be non-negative and less than
* the <code>streams</code> parameter used
* to initialize the decoder.
* @param[out] y <tt>OpusDecoder**</tt>: Returns a pointer to the given
* decoder state.
* @retval OPUS_BAD_ARG The index of the requested stream was out of range.
* @hideinitializer
*/
#define OPUS_MULTISTREAM_GET_DECODER_STATE(x,y) OPUS_MULTISTREAM_GET_DECODER_STATE_REQUEST, __opus_check_int(x), __opus_check_decstate_ptr(y)
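/** A hedged per-stream CTL sketch (msenc is a hypothetical OpusMSEncoder*):
 * fetch the elementary encoder for stream 0 and raise only its complexity.
 * @code
 * OpusEncoder *stream_enc;
 * int ret = opus_multistream_encoder_ctl(msenc,
 *               OPUS_MULTISTREAM_GET_ENCODER_STATE(0, &stream_enc));
 * if (ret == OPUS_OK)
 *    opus_encoder_ctl(stream_enc, OPUS_SET_COMPLEXITY(10));
 * @endcode
 */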
/**@}*/
/** @defgroup opus_multistream Opus Multistream API
* @{
*
* The multistream API allows individual Opus streams to be combined into a
* single packet, enabling support for up to 255 channels. Unlike an
* elementary Opus stream, the encoder and decoder must negotiate the channel
* configuration before the decoder can successfully interpret the data in the
* packets produced by the encoder. Some basic information, such as packet
* duration, can be computed without any special negotiation.
*
* The format for multistream Opus packets is defined in the
* <a href="http://tools.ietf.org/html/draft-terriberry-oggopus">Ogg
* encapsulation specification</a> and is based on the self-delimited Opus
* framing described in Appendix B of <a href="http://tools.ietf.org/html/rfc6716">RFC 6716</a>.
* Normal Opus packets are just a degenerate case of multistream Opus packets,
* and can be encoded or decoded with the multistream API by setting
* <code>streams</code> to <code>1</code> when initializing the encoder or
* decoder.
*
* Multistream Opus streams can contain up to 255 elementary Opus streams.
* These may be either "uncoupled" or "coupled", indicating that the decoder
* is configured to decode them to either 1 or 2 channels, respectively.
* The streams are ordered so that all coupled streams appear at the
* beginning.
*
* A <code>mapping</code> table defines which decoded channel <code>i</code>
* should be used for each input/output (I/O) channel <code>j</code>. This table is
* typically provided as an unsigned char array.
* Let <code>i = mapping[j]</code> be the index for I/O channel <code>j</code>.
* If <code>i < 2*coupled_streams</code>, then I/O channel <code>j</code> is
* encoded as the left channel of stream <code>(i/2)</code> if <code>i</code>
* is even, or as the right channel of stream <code>(i/2)</code> if
* <code>i</code> is odd. Otherwise, I/O channel <code>j</code> is encoded as
* mono in stream <code>(i - coupled_streams)</code>, unless it has the special
* value 255, in which case it is omitted from the encoding entirely (the
* decoder will reproduce it as silence). Each value <code>i</code> must either
* be the special value 255 or be less than <code>streams + coupled_streams</code>.
*
* The output channels specified by the encoder
* should use the
* <a href="http://www.xiph.org/vorbis/doc/Vorbis_I_spec.html#x1-800004.3.9">Vorbis
* channel ordering</a>. A decoder may wish to apply an additional permutation
* to the mapping the encoder used to achieve a different output channel
 * order (e.g. for outputting in WAV order).
*
* Each multistream packet contains an Opus packet for each stream, and all of
* the Opus packets in a single multistream packet must have the same
* duration. Therefore the duration of a multistream packet can be extracted
* from the TOC sequence of the first stream, which is located at the
* beginning of the packet, just like an elementary Opus stream:
*
* @code
* int nb_samples;
* int nb_frames;
* nb_frames = opus_packet_get_nb_frames(data, len);
* if (nb_frames < 1)
* return nb_frames;
* nb_samples = opus_packet_get_samples_per_frame(data, 48000) * nb_frames;
* @endcode
*
* The general encoding and decoding process proceeds exactly the same as in
* the normal @ref opus_encoder and @ref opus_decoder APIs.
* See their documentation for an overview of how to use the corresponding
* multistream functions.
*/
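/** An illustrative mapping sketch (not from the original header): one commonly
 * used 5.1 layout, assuming Vorbis channel order (FL, C, FR, RL, RR, LFE).
 * Two coupled streams carry the front and rear pairs, and two mono streams
 * carry the center and LFE channels; err and the variable names are
 * hypothetical.
 * @code
 * unsigned char mapping[6] = {0, 4, 1, 2, 3, 5};
 * int err;
 * OpusMSEncoder *msenc = opus_multistream_encoder_create(
 *     48000, 6, 4, 2, mapping, OPUS_APPLICATION_AUDIO, &err);
 * @endcode
 */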
/** Opus multistream encoder state.
* This contains the complete state of a multistream Opus encoder.
* It is position independent and can be freely copied.
* @see opus_multistream_encoder_create
* @see opus_multistream_encoder_init
*/
typedef struct OpusMSEncoder OpusMSEncoder;
/** Opus multistream decoder state.
* This contains the complete state of a multistream Opus decoder.
* It is position independent and can be freely copied.
* @see opus_multistream_decoder_create
* @see opus_multistream_decoder_init
*/
typedef struct OpusMSDecoder OpusMSDecoder;
/**\name Multistream encoder functions */
/**@{*/
/** Gets the size of an OpusMSEncoder structure.
* @param streams <tt>int</tt>: The total number of streams to encode from the
* input.
* This must be no more than 255.
* @param coupled_streams <tt>int</tt>: Number of coupled (2 channel) streams
* to encode.
* This must be no larger than the total
* number of streams.
 * Additionally, the total number of
* encoded channels (<code>streams +
* coupled_streams</code>) must be no
* more than 255.
* @returns The size in bytes on success, or a negative error code
* (see @ref opus_errorcodes) on error.
*/
OPUS_EXPORT OPUS_WARN_UNUSED_RESULT opus_int32 opus_multistream_encoder_get_size(
int streams,
int coupled_streams
);
OPUS_EXPORT OPUS_WARN_UNUSED_RESULT opus_int32 opus_multistream_surround_encoder_get_size(
int channels,
int mapping_family
);
/** Allocates and initializes a multistream encoder state.
* Call opus_multistream_encoder_destroy() to release
* this object when finished.
* @param Fs <tt>opus_int32</tt>: Sampling rate of the input signal (in Hz).
* This must be one of 8000, 12000, 16000,
* 24000, or 48000.
* @param channels <tt>int</tt>: Number of channels in the input signal.
* This must be at most 255.
* It may be greater than the number of
* coded channels (<code>streams +
* coupled_streams</code>).
* @param streams <tt>int</tt>: The total number of streams to encode from the
* input.
* This must be no more than the number of channels.
* @param coupled_streams <tt>int</tt>: Number of coupled (2 channel) streams
* to encode.
* This must be no larger than the total
* number of streams.
 * Additionally, the total number of
* encoded channels (<code>streams +
* coupled_streams</code>) must be no
* more than the number of input channels.
* @param[in] mapping <code>const unsigned char[channels]</code>: Mapping from
* encoded channels to input channels, as described in
* @ref opus_multistream. As an extra constraint, the
* multistream encoder does not allow encoding coupled
* streams for which one channel is unused since this
* is never a good idea.
* @param application <tt>int</tt>: The target encoder application.
* This must be one of the following:
* <dl>
* <dt>#OPUS_APPLICATION_VOIP</dt>
* <dd>Process signal for improved speech intelligibility.</dd>
* <dt>#OPUS_APPLICATION_AUDIO</dt>
* <dd>Favor faithfulness to the original input.</dd>
* <dt>#OPUS_APPLICATION_RESTRICTED_LOWDELAY</dt>
* <dd>Configure the minimum possible coding delay by disabling certain modes
* of operation.</dd>
* </dl>
* @param[out] error <tt>int *</tt>: Returns #OPUS_OK on success, or an error
* code (see @ref opus_errorcodes) on
* failure.
*/
OPUS_EXPORT OPUS_WARN_UNUSED_RESULT OpusMSEncoder *opus_multistream_encoder_create(
opus_int32 Fs,
int channels,
int streams,
int coupled_streams,
const unsigned char *mapping,
int application,
int *error
) OPUS_ARG_NONNULL(5);
OPUS_EXPORT OPUS_WARN_UNUSED_RESULT OpusMSEncoder *opus_multistream_surround_encoder_create(
opus_int32 Fs,
int channels,
int mapping_family,
int *streams,
int *coupled_streams,
unsigned char *mapping,
int application,
int *error
) OPUS_ARG_NONNULL(5);
/** Initialize a previously allocated multistream encoder state.
* The memory pointed to by \a st must be at least the size returned by
* opus_multistream_encoder_get_size().
* This is intended for applications which use their own allocator instead of
* malloc.
* To reset a previously initialized state, use the #OPUS_RESET_STATE CTL.
* @see opus_multistream_encoder_create
* @see opus_multistream_encoder_get_size
* @param st <tt>OpusMSEncoder*</tt>: Multistream encoder state to initialize.
* @param Fs <tt>opus_int32</tt>: Sampling rate of the input signal (in Hz).
* This must be one of 8000, 12000, 16000,
* 24000, or 48000.
* @param channels <tt>int</tt>: Number of channels in the input signal.
* This must be at most 255.
* It may be greater than the number of
* coded channels (<code>streams +
* coupled_streams</code>).
* @param streams <tt>int</tt>: The total number of streams to encode from the
* input.
* This must be no more than the number of channels.
* @param coupled_streams <tt>int</tt>: Number of coupled (2 channel) streams
* to encode.
* This must be no larger than the total
* number of streams.
 * Additionally, the total number of
* encoded channels (<code>streams +
* coupled_streams</code>) must be no
* more than the number of input channels.
* @param[in] mapping <code>const unsigned char[channels]</code>: Mapping from
* encoded channels to input channels, as described in
* @ref opus_multistream. As an extra constraint, the
* multistream encoder does not allow encoding coupled
* streams for which one channel is unused since this
* is never a good idea.
* @param application <tt>int</tt>: The target encoder application.
* This must be one of the following:
* <dl>
* <dt>#OPUS_APPLICATION_VOIP</dt>
* <dd>Process signal for improved speech intelligibility.</dd>
* <dt>#OPUS_APPLICATION_AUDIO</dt>
* <dd>Favor faithfulness to the original input.</dd>
* <dt>#OPUS_APPLICATION_RESTRICTED_LOWDELAY</dt>
* <dd>Configure the minimum possible coding delay by disabling certain modes
* of operation.</dd>
* </dl>
* @returns #OPUS_OK on success, or an error code (see @ref opus_errorcodes)
* on failure.
*/
OPUS_EXPORT int opus_multistream_encoder_init(
OpusMSEncoder *st,
opus_int32 Fs,
int channels,
int streams,
int coupled_streams,
const unsigned char *mapping,
int application
) OPUS_ARG_NONNULL(1) OPUS_ARG_NONNULL(6);
OPUS_EXPORT int opus_multistream_surround_encoder_init(
OpusMSEncoder *st,
opus_int32 Fs,
int channels,
int mapping_family,
int *streams,
int *coupled_streams,
unsigned char *mapping,
int application
) OPUS_ARG_NONNULL(1) OPUS_ARG_NONNULL(6);
/** Encodes a multistream Opus frame.
* @param st <tt>OpusMSEncoder*</tt>: Multistream encoder state.
* @param[in] pcm <tt>const opus_int16*</tt>: The input signal as interleaved
* samples.
* This must contain
* <code>frame_size*channels</code>
* samples.
* @param frame_size <tt>int</tt>: Number of samples per channel in the input
* signal.
* This must be an Opus frame size for the
* encoder's sampling rate.
* For example, at 48 kHz the permitted values
* are 120, 240, 480, 960, 1920, and 2880.
* Passing in a duration of less than 10 ms
* (480 samples at 48 kHz) will prevent the
* encoder from using the LPC or hybrid modes.
* @param[out] data <tt>unsigned char*</tt>: Output payload.
* This must contain storage for at
* least \a max_data_bytes.
* @param [in] max_data_bytes <tt>opus_int32</tt>: Size of the allocated
* memory for the output
* payload. This may be
* used to impose an upper limit on
* the instant bitrate, but should
* not be used as the only bitrate
* control. Use #OPUS_SET_BITRATE to
* control the bitrate.
* @returns The length of the encoded packet (in bytes) on success or a
* negative error code (see @ref opus_errorcodes) on failure.
*/
OPUS_EXPORT OPUS_WARN_UNUSED_RESULT int opus_multistream_encode(
OpusMSEncoder *st,
const opus_int16 *pcm,
int frame_size,
unsigned char *data,
opus_int32 max_data_bytes
) OPUS_ARG_NONNULL(1) OPUS_ARG_NONNULL(2) OPUS_ARG_NONNULL(4);
/** Encodes a multistream Opus frame from floating point input.
* @param st <tt>OpusMSEncoder*</tt>: Multistream encoder state.
* @param[in] pcm <tt>const float*</tt>: The input signal as interleaved
* samples with a normal range of
* +/-1.0.
* Samples with a range beyond +/-1.0
* are supported but will be clipped by
* decoders using the integer API and
* should only be used if it is known
* that the far end supports extended
* dynamic range.
* This must contain
* <code>frame_size*channels</code>
* samples.
* @param frame_size <tt>int</tt>: Number of samples per channel in the input
* signal.
* This must be an Opus frame size for the
* encoder's sampling rate.
* For example, at 48 kHz the permitted values
* are 120, 240, 480, 960, 1920, and 2880.
* Passing in a duration of less than 10 ms
* (480 samples at 48 kHz) will prevent the
* encoder from using the LPC or hybrid modes.
* @param[out] data <tt>unsigned char*</tt>: Output payload.
* This must contain storage for at
* least \a max_data_bytes.
* @param [in] max_data_bytes <tt>opus_int32</tt>: Size of the allocated
* memory for the output
* payload. This may be
* used to impose an upper limit on
* the instant bitrate, but should
* not be used as the only bitrate
* control. Use #OPUS_SET_BITRATE to
* control the bitrate.
* @returns The length of the encoded packet (in bytes) on success or a
* negative error code (see @ref opus_errorcodes) on failure.
*/
OPUS_EXPORT OPUS_WARN_UNUSED_RESULT int opus_multistream_encode_float(
OpusMSEncoder *st,
const float *pcm,
int frame_size,
unsigned char *data,
opus_int32 max_data_bytes
) OPUS_ARG_NONNULL(1) OPUS_ARG_NONNULL(2) OPUS_ARG_NONNULL(4);
/** Frees an <code>OpusMSEncoder</code> allocated by
* opus_multistream_encoder_create().
* @param st <tt>OpusMSEncoder*</tt>: Multistream encoder state to be freed.
*/
OPUS_EXPORT void opus_multistream_encoder_destroy(OpusMSEncoder *st);
/** Perform a CTL function on a multistream Opus encoder.
*
* Generally the request and subsequent arguments are generated by a
* convenience macro.
* @param st <tt>OpusMSEncoder*</tt>: Multistream encoder state.
* @param request This and all remaining parameters should be replaced by one
* of the convenience macros in @ref opus_genericctls,
* @ref opus_encoderctls, or @ref opus_multistream_ctls.
* @see opus_genericctls
* @see opus_encoderctls
* @see opus_multistream_ctls
*/
OPUS_EXPORT int opus_multistream_encoder_ctl(OpusMSEncoder *st, int request, ...) OPUS_ARG_NONNULL(1);
/**@}*/
/**\name Multistream decoder functions */
/**@{*/
/** Gets the size of an <code>OpusMSDecoder</code> structure.
* @param streams <tt>int</tt>: The total number of streams coded in the
* input.
* This must be no more than 255.
 * @param coupled_streams <tt>int</tt>: Number of streams to decode as coupled
* (2 channel) streams.
* This must be no larger than the total
* number of streams.
 * Additionally, the total number of
* coded channels (<code>streams +
* coupled_streams</code>) must be no
* more than 255.
* @returns The size in bytes on success, or a negative error code
* (see @ref opus_errorcodes) on error.
*/
OPUS_EXPORT OPUS_WARN_UNUSED_RESULT opus_int32 opus_multistream_decoder_get_size(
int streams,
int coupled_streams
);
/** Allocates and initializes a multistream decoder state.
* Call opus_multistream_decoder_destroy() to release
* this object when finished.
* @param Fs <tt>opus_int32</tt>: Sampling rate to decode at (in Hz).
* This must be one of 8000, 12000, 16000,
* 24000, or 48000.
* @param channels <tt>int</tt>: Number of channels to output.
* This must be at most 255.
* It may be different from the number of coded
* channels (<code>streams +
* coupled_streams</code>).
* @param streams <tt>int</tt>: The total number of streams coded in the
* input.
* This must be no more than 255.
* @param coupled_streams <tt>int</tt>: Number of streams to decode as coupled
* (2 channel) streams.
* This must be no larger than the total
* number of streams.
 * Additionally, the total number of
* coded channels (<code>streams +
* coupled_streams</code>) must be no
* more than 255.
* @param[in] mapping <code>const unsigned char[channels]</code>: Mapping from
* coded channels to output channels, as described in
* @ref opus_multistream.
* @param[out] error <tt>int *</tt>: Returns #OPUS_OK on success, or an error
* code (see @ref opus_errorcodes) on
* failure.
*/
OPUS_EXPORT OPUS_WARN_UNUSED_RESULT OpusMSDecoder *opus_multistream_decoder_create(
opus_int32 Fs,
int channels,
int streams,
int coupled_streams,
const unsigned char *mapping,
int *error
) OPUS_ARG_NONNULL(5);
/** Initialize a previously allocated decoder state object.
 * The memory pointed to by \a st must be at least the size returned by
 * opus_multistream_decoder_get_size().
* This is intended for applications which use their own allocator instead of
* malloc.
* To reset a previously initialized state, use the #OPUS_RESET_STATE CTL.
* @see opus_multistream_decoder_create
 * @see opus_multistream_decoder_get_size
 * @param st <tt>OpusMSDecoder*</tt>: Multistream decoder state to initialize.
* @param Fs <tt>opus_int32</tt>: Sampling rate to decode at (in Hz).
* This must be one of 8000, 12000, 16000,
* 24000, or 48000.
* @param channels <tt>int</tt>: Number of channels to output.
* This must be at most 255.
* It may be different from the number of coded
* channels (<code>streams +
* coupled_streams</code>).
* @param streams <tt>int</tt>: The total number of streams coded in the
* input.
* This must be no more than 255.
* @param coupled_streams <tt>int</tt>: Number of streams to decode as coupled
* (2 channel) streams.
* This must be no larger than the total
* number of streams.
 * Additionally, the total number of
* coded channels (<code>streams +
* coupled_streams</code>) must be no
* more than 255.
* @param[in] mapping <code>const unsigned char[channels]</code>: Mapping from
* coded channels to output channels, as described in
* @ref opus_multistream.
* @returns #OPUS_OK on success, or an error code (see @ref opus_errorcodes)
* on failure.
*/
OPUS_EXPORT int opus_multistream_decoder_init(
OpusMSDecoder *st,
opus_int32 Fs,
int channels,
int streams,
int coupled_streams,
const unsigned char *mapping
) OPUS_ARG_NONNULL(1) OPUS_ARG_NONNULL(6);
/** Decode a multistream Opus packet.
* @param st <tt>OpusMSDecoder*</tt>: Multistream decoder state.
* @param[in] data <tt>const unsigned char*</tt>: Input payload.
* Use a <code>NULL</code>
* pointer to indicate packet
* loss.
* @param len <tt>opus_int32</tt>: Number of bytes in payload.
* @param[out] pcm <tt>opus_int16*</tt>: Output signal, with interleaved
* samples.
* This must contain room for
* <code>frame_size*channels</code>
* samples.
* @param frame_size <tt>int</tt>: The number of samples per channel of
* available space in \a pcm.
* If this is less than the maximum packet duration
 * (120 ms; 5760 for 48 kHz), this function will not be capable
* of decoding some packets. In the case of PLC (data==NULL)
* or FEC (decode_fec=1), then frame_size needs to be exactly
* the duration of audio that is missing, otherwise the
* decoder will not be in the optimal state to decode the
* next incoming packet. For the PLC and FEC cases, frame_size
* <b>must</b> be a multiple of 2.5 ms.
* @param decode_fec <tt>int</tt>: Flag (0 or 1) to request that any in-band
* forward error correction data be decoded.
* If no such data is available, the frame is
* decoded as if it were lost.
* @returns Number of samples decoded on success or a negative error code
* (see @ref opus_errorcodes) on failure.
*/
OPUS_EXPORT OPUS_WARN_UNUSED_RESULT int opus_multistream_decode(
OpusMSDecoder *st,
const unsigned char *data,
opus_int32 len,
opus_int16 *pcm,
int frame_size,
int decode_fec
) OPUS_ARG_NONNULL(1) OPUS_ARG_NONNULL(4);
/** Decode a multistream Opus packet with floating point output.
* @param st <tt>OpusMSDecoder*</tt>: Multistream decoder state.
* @param[in] data <tt>const unsigned char*</tt>: Input payload.
* Use a <code>NULL</code>
* pointer to indicate packet
* loss.
* @param len <tt>opus_int32</tt>: Number of bytes in payload.
 * @param[out] pcm <tt>float*</tt>: Output signal, with interleaved
* samples.
* This must contain room for
* <code>frame_size*channels</code>
* samples.
* @param frame_size <tt>int</tt>: The number of samples per channel of
* available space in \a pcm.
* If this is less than the maximum packet duration
 * (120 ms; 5760 for 48 kHz), this function will not be capable
* of decoding some packets. In the case of PLC (data==NULL)
* or FEC (decode_fec=1), then frame_size needs to be exactly
* the duration of audio that is missing, otherwise the
* decoder will not be in the optimal state to decode the
* next incoming packet. For the PLC and FEC cases, frame_size
* <b>must</b> be a multiple of 2.5 ms.
* @param decode_fec <tt>int</tt>: Flag (0 or 1) to request that any in-band
* forward error correction data be decoded.
* If no such data is available, the frame is
* decoded as if it were lost.
* @returns Number of samples decoded on success or a negative error code
* (see @ref opus_errorcodes) on failure.
*/
OPUS_EXPORT OPUS_WARN_UNUSED_RESULT int opus_multistream_decode_float(
OpusMSDecoder *st,
const unsigned char *data,
opus_int32 len,
float *pcm,
int frame_size,
int decode_fec
) OPUS_ARG_NONNULL(1) OPUS_ARG_NONNULL(4);
/** Perform a CTL function on a multistream Opus decoder.
*
* Generally the request and subsequent arguments are generated by a
* convenience macro.
* @param st <tt>OpusMSDecoder*</tt>: Multistream decoder state.
* @param request This and all remaining parameters should be replaced by one
* of the convenience macros in @ref opus_genericctls,
* @ref opus_decoderctls, or @ref opus_multistream_ctls.
* @see opus_genericctls
* @see opus_decoderctls
* @see opus_multistream_ctls
*/
OPUS_EXPORT int opus_multistream_decoder_ctl(OpusMSDecoder *st, int request, ...) OPUS_ARG_NONNULL(1);
/** Frees an <code>OpusMSDecoder</code> allocated by
* opus_multistream_decoder_create().
 * @param st <tt>OpusMSDecoder*</tt>: Multistream decoder state to be freed.
*/
OPUS_EXPORT void opus_multistream_decoder_destroy(OpusMSDecoder *st);
/**@}*/
/**@}*/
#ifdef __cplusplus
}
#endif
#endif /* OPUS_MULTISTREAM_H */
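Taken together, the declarations above form the complete multistream encode/decode API. The following stand-alone sketch (not part of this header or of the diff) shows one way to wire them together for an ordinary stereo stream: one coupled stream carrying two output channels with the identity mapping {0, 1}. It assumes the standard libopus encoder-side entry points (opus_multistream_encoder_create() and OPUS_APPLICATION_AUDIO), which this header declares in its encoder section; error handling is abbreviated.

/* example_multistream.c -- illustrative sketch only */
#include <stdio.h>
#include <string.h>
#include <opus_multistream.h>

int main(void)
{
    const opus_int32 Fs = 48000;
    const int channels = 2, streams = 1, coupled_streams = 1;
    const unsigned char mapping[2] = {0, 1};   /* identity channel mapping */
    int err;

    /* Encoder state: one coupled (stereo) stream at 128 kb/s. */
    OpusMSEncoder *enc = opus_multistream_encoder_create(
        Fs, channels, streams, coupled_streams, mapping,
        OPUS_APPLICATION_AUDIO, &err);
    if (err != OPUS_OK)
        return 1;
    opus_multistream_encoder_ctl(enc, OPUS_SET_BITRATE(128000));

    /* Encode one 20 ms frame (960 samples per channel at 48 kHz) of silence. */
    opus_int16 pcm[960 * 2] = {0};
    unsigned char packet[4000];
    int len = opus_multistream_encode(enc, pcm, 960, packet,
                                      (opus_int32)sizeof(packet));
    if (len < 0)
        return 1;

    /* Decoder state with the same stream layout and mapping. */
    OpusMSDecoder *dec = opus_multistream_decoder_create(
        Fs, channels, streams, coupled_streams, mapping, &err);
    if (err != OPUS_OK)
        return 1;

    /* Room for the maximum packet duration: 120 ms = 5760 samples at 48 kHz. */
    opus_int16 out[5760 * 2];
    int samples = opus_multistream_decode(dec, packet, len, out, 5760, 0);
    printf("encoded %d bytes, decoded %d samples per channel\n", len, samples);

    opus_multistream_decoder_destroy(dec);
    opus_multistream_encoder_destroy(enc);
    return 0;
}

The 4000-byte packet buffer is simply a generous per-frame upper bound; as the max_data_bytes notes above point out, real callers should still control the bitrate with OPUS_SET_BITRATE rather than relying on the buffer size.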


@@ -1,159 +0,0 @@
/* (C) COPYRIGHT 1994-2002 Xiph.Org Foundation */
/* Modified by Jean-Marc Valin */
/*
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions
are met:
- Redistributions of source code must retain the above copyright
notice, this list of conditions and the following disclaimer.
- Redistributions in binary form must reproduce the above copyright
notice, this list of conditions and the following disclaimer in the
documentation and/or other materials provided with the distribution.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER
OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF
LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
/* opus_types.h based on ogg_types.h from libogg */
/**
@file opus_types.h
@brief Opus reference implementation types
*/
#ifndef OPUS_TYPES_H
#define OPUS_TYPES_H
/* Use the real stdint.h if it's there (taken from Paul Hsieh's pstdint.h) */
#if (defined(__STDC__) && __STDC__ && __STDC_VERSION__ >= 199901L) || (defined(__GNUC__) && (defined(_STDINT_H) || defined(_STDINT_H_)) || defined (HAVE_STDINT_H))
#include <stdint.h>
typedef int16_t opus_int16;
typedef uint16_t opus_uint16;
typedef int32_t opus_int32;
typedef uint32_t opus_uint32;
#elif defined(_WIN32)
# if defined(__CYGWIN__)
# include <_G_config.h>
typedef _G_int32_t opus_int32;
typedef _G_uint32_t opus_uint32;
typedef _G_int16 opus_int16;
typedef _G_uint16 opus_uint16;
# elif defined(__MINGW32__)
typedef short opus_int16;
typedef unsigned short opus_uint16;
typedef int opus_int32;
typedef unsigned int opus_uint32;
# elif defined(__MWERKS__)
typedef int opus_int32;
typedef unsigned int opus_uint32;
typedef short opus_int16;
typedef unsigned short opus_uint16;
# else
/* MSVC/Borland */
typedef __int32 opus_int32;
typedef unsigned __int32 opus_uint32;
typedef __int16 opus_int16;
typedef unsigned __int16 opus_uint16;
# endif
#elif defined(__MACOS__)
# include <sys/types.h>
typedef SInt16 opus_int16;
typedef UInt16 opus_uint16;
typedef SInt32 opus_int32;
typedef UInt32 opus_uint32;
#elif (defined(__APPLE__) && defined(__MACH__)) /* MacOS X Framework build */
# include <sys/types.h>
typedef int16_t opus_int16;
typedef u_int16_t opus_uint16;
typedef int32_t opus_int32;
typedef u_int32_t opus_uint32;
#elif defined(__BEOS__)
/* Be */
# include <inttypes.h>
typedef int16 opus_int16;
typedef u_int16 opus_uint16;
typedef int32_t opus_int32;
typedef u_int32_t opus_uint32;
#elif defined (__EMX__)
/* OS/2 GCC */
typedef short opus_int16;
typedef unsigned short opus_uint16;
typedef int opus_int32;
typedef unsigned int opus_uint32;
#elif defined (DJGPP)
/* DJGPP */
typedef short opus_int16;
typedef unsigned short opus_uint16;
typedef int opus_int32;
typedef unsigned int opus_uint32;
#elif defined(R5900)
/* PS2 EE */
typedef int opus_int32;
typedef unsigned opus_uint32;
typedef short opus_int16;
typedef unsigned short opus_uint16;
#elif defined(__SYMBIAN32__)
/* Symbian GCC */
typedef signed short opus_int16;
typedef unsigned short opus_uint16;
typedef signed int opus_int32;
typedef unsigned int opus_uint32;
#elif defined(CONFIG_TI_C54X) || defined (CONFIG_TI_C55X)
typedef short opus_int16;
typedef unsigned short opus_uint16;
typedef long opus_int32;
typedef unsigned long opus_uint32;
#elif defined(CONFIG_TI_C6X)
typedef short opus_int16;
typedef unsigned short opus_uint16;
typedef int opus_int32;
typedef unsigned int opus_uint32;
#else
/* Give up, take a reasonable guess */
typedef short opus_int16;
typedef unsigned short opus_uint16;
typedef int opus_int32;
typedef unsigned int opus_uint32;
#endif
#define opus_int int /* used for counters etc; at least 16 bits */
#define opus_int64 long long
#define opus_int8 signed char
#define opus_uint unsigned int /* used for counters etc; at least 16 bits */
#define opus_uint64 unsigned long long
#define opus_uint8 unsigned char
#endif /* OPUS_TYPES_H */
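Whichever branch of the ladder above is selected, the intent is that opus_int16/opus_uint16 are exactly 16 bits wide and opus_int32/opus_uint32 exactly 32 bits wide, with the expected signedness. A minimal, hypothetical C11 sanity check that a port could compile to confirm the chosen typedefs (not part of libopus):

/* check_opus_types.c -- hypothetical compile-time check of the typedefs above */
#include <limits.h>
#include <opus_types.h>

_Static_assert(sizeof(opus_int16)  * CHAR_BIT == 16, "opus_int16 must be 16 bits");
_Static_assert(sizeof(opus_uint16) * CHAR_BIT == 16, "opus_uint16 must be 16 bits");
_Static_assert(sizeof(opus_int32)  * CHAR_BIT == 32, "opus_int32 must be 32 bits");
_Static_assert(sizeof(opus_uint32) * CHAR_BIT == 32, "opus_uint32 must be 32 bits");
_Static_assert((opus_int16)-1 < 0,  "opus_int16 must be signed");
_Static_assert((opus_uint16)-1 > 0, "opus_uint16 must be unsigned");

int main(void) { return 0; }

Multiplying by CHAR_BIT rather than assuming 8-bit bytes keeps the check meaningful on the word-addressed DSP targets (TI C54x/C55x) covered by the fallback branches above.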


@@ -1,42 +0,0 @@
#include <stdlib.h>
#include <opus_multistream.h>
#include "nv_opus_dec.h"
OpusMSDecoder* decoder;
// This function must be called before
// any other decoding functions
int nv_opus_init(int sampleRate, int channelCount, int streams,
int coupledStreams, const unsigned char *mapping) {
int err;
decoder = opus_multistream_decoder_create(
sampleRate,
channelCount,
streams,
coupledStreams,
mapping,
&err);
return err;
}
// This function must be called after
// decoding is finished
void nv_opus_destroy(void) {
if (decoder != NULL) {
opus_multistream_decoder_destroy(decoder);
}
}
// packets must be decoded in order
// a packet loss must call this function with NULL indata and 0 inlen
// returns the number of decoded samples
int nv_opus_decode(unsigned char* indata, int inlen, short* outpcmdata, int framesize) {
int err;
// Decoding to 16-bit PCM with FEC off
// Maximum length assuming 48 kHz sample rate
err = opus_multistream_decode(decoder, indata, inlen,
outpcmdata, framesize, 0);
return err;
}
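For context, a hypothetical caller of this wrapper (not part of the project) would drive it roughly as follows for a plain stereo stream: one coupled stream, identity mapping, with a NULL payload standing in for a lost packet as the comments above describe.

/* Hypothetical usage of the nv_opus wrapper above; names follow nv_opus_dec.h. */
#include <stdio.h>
#include "nv_opus_dec.h"

int main(void)
{
    /* Stereo: 2 output channels carried by 1 coupled stream, identity mapping. */
    const unsigned char mapping[2] = {0, 1};
    if (nv_opus_init(48000, 2, 1, 1, mapping) != 0)
        return 1;

    /* 240 samples per channel = 5 ms at 48 kHz, 16-bit interleaved output. */
    short pcm[240 * 2];

    /* NULL input with zero length requests packet-loss concealment. */
    int decoded = nv_opus_decode(NULL, 0, pcm, 240);
    printf("decoded %d samples per channel\n", decoded);

    nv_opus_destroy();
    return 0;
}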


@@ -1,4 +0,0 @@
int nv_opus_init(int sampleRate, int channelCount, int streams,
int coupledStreams, const unsigned char *mapping);
void nv_opus_destroy(void);
int nv_opus_decode(unsigned char* indata, int inlen, short* outpcmdata, int framesize);


@@ -1,71 +0,0 @@
#include "nv_opus_dec.h"
#include <stdlib.h>
#include <jni.h>
static int SamplesPerChannel;
static int ChannelCount;
// This function must be called before
// any other decoding functions
JNIEXPORT jint JNICALL
Java_com_limelight_nvstream_av_audio_OpusDecoder_init(JNIEnv *env, jobject this, int sampleRate,
int samplesPerChannel, int channelCount, int streams,
int coupledStreams, jbyteArray mapping) {
jbyte* jni_mapping_data;
jint ret;
SamplesPerChannel = samplesPerChannel;
ChannelCount = channelCount;
jni_mapping_data = (*env)->GetByteArrayElements(env, mapping, 0);
ret = nv_opus_init(sampleRate, channelCount, streams, coupledStreams,
(const unsigned char*)jni_mapping_data);
(*env)->ReleaseByteArrayElements(env, mapping, jni_mapping_data, JNI_ABORT);
return ret;
}
// This function must be called after
// decoding is finished
JNIEXPORT void JNICALL
Java_com_limelight_nvstream_av_audio_OpusDecoder_destroy(JNIEnv *env, jobject this) {
nv_opus_destroy();
}
// packets must be decoded in order
// a packet loss must call this function with NULL indata and 0 inlen
// returns the number of decoded bytes
JNIEXPORT jint JNICALL
Java_com_limelight_nvstream_av_audio_OpusDecoder_decode(
JNIEnv *env, jobject this, // JNI parameters
jbyteArray indata, jint inoff, jint inlen, // Input parameters
jbyteArray outpcmdata) // Output parameter
{
jint ret;
jbyte* jni_input_data;
jbyte* jni_pcm_data;
jni_pcm_data = (*env)->GetByteArrayElements(env, outpcmdata, 0);
if (indata != NULL) {
jni_input_data = (*env)->GetByteArrayElements(env, indata, 0);
ret = nv_opus_decode((unsigned char*)&jni_input_data[inoff], inlen,
(jshort*)jni_pcm_data, SamplesPerChannel);
// The input data isn't changed so it can be safely aborted
(*env)->ReleaseByteArrayElements(env, indata, jni_input_data, JNI_ABORT);
}
else {
ret = nv_opus_decode(NULL, 0, (jshort*)jni_pcm_data, SamplesPerChannel);
}
// Convert samples (2 bytes) per channel to total bytes returned
if (ret > 0) {
ret *= ChannelCount * 2;
}
(*env)->ReleaseByteArrayElements(env, outpcmdata, jni_pcm_data, 0);
return ret;
}

(Several binary image assets were modified or removed in this diff; contents not shown.)

@@ -50,6 +50,7 @@
android:layout_height="65dp"
android:cropToPadding="false"
android:scaleType="fitXY"
android:nextFocusForward="@id/helpButton"
android:layout_alignParentStart="true"
android:layout_alignParentLeft="true"
android:layout_alignParentTop="true"
@@ -62,6 +63,7 @@
android:layout_height="65dp"
android:cropToPadding="false"
android:scaleType="fitXY"
android:nextFocusForward="@id/manuallyAddPc"
android:layout_alignParentStart="true"
android:layout_alignParentLeft="true"
android:layout_below="@id/settingsButton"


@@ -6,7 +6,7 @@
android:paddingLeft="@dimen/activity_horizontal_margin"
android:paddingRight="@dimen/activity_horizontal_margin"
android:paddingTop="@dimen/activity_vertical_margin"
tools:context=".AddComputerManually" >
tools:context=".preferences.AddComputerManually" >
<TextView
android:id="@+id/manuallyAddPcText"


@@ -1,19 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:tools="http://schemas.android.com/tools"
android:id="@+id/activity_help"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:paddingBottom="@dimen/activity_vertical_margin"
android:paddingLeft="@dimen/activity_horizontal_margin"
android:paddingRight="@dimen/activity_horizontal_margin"
android:paddingTop="@dimen/activity_vertical_margin"
tools:context="com.limelight.HelpActivity">
<WebView
android:layout_width="match_parent"
android:layout_height="match_parent"
android:layout_alignParentTop="true"
android:layout_alignParentLeft="true"
android:layout_alignParentStart="true" />
</RelativeLayout>


@@ -7,6 +7,6 @@
android:paddingTop="@dimen/activity_vertical_margin"
android:paddingBottom="@dimen/activity_vertical_margin"
android:id="@+id/stream_settings"
tools:context=".StreamSettings">
tools:context=".preferences.StreamSettings">
</RelativeLayout>


@@ -10,5 +10,6 @@
android:columnWidth="160dp"
android:focusable="true"
android:focusableInTouchMode="true"
android:nextFocusLeft="@id/settingsButton"
android:gravity="center"/>
</LinearLayout>


@@ -10,5 +10,6 @@
android:columnWidth="105dp"
android:focusable="true"
android:focusableInTouchMode="true"
android:nextFocusLeft="@id/settingsButton"
android:gravity="center"/>
</LinearLayout>


@@ -0,0 +1,5 @@
<?xml version="1.0" encoding="utf-8"?>
<adaptive-icon xmlns:android="http://schemas.android.com/apk/res/android">
<background android:drawable="@color/ic_launcher_background"/>
<foreground android:drawable="@mipmap/ic_launcher_foreground"/>
</adaptive-icon>


@@ -0,0 +1,5 @@
<?xml version="1.0" encoding="utf-8"?>
<adaptive-icon xmlns:android="http://schemas.android.com/apk/res/android">
<background android:drawable="@color/ic_pc_scut_background"/>
<foreground android:drawable="@mipmap/ic_pc_scut_foreground"/>
</adaptive-icon>

(Several binary image assets were added or modified in this diff; contents not shown.)

Some files were not shown because too many files have changed in this diff.