This is a port of eSpeak to Android, for Android 4.0 (Ice Cream Sandwich) and later (including Android 4.3), supporting all 79 languages and accents of eSpeak (see http://reecedunn.co.uk/espeak-for-android for the list).
It is built on the eyes-free port and fixes multiple issues found in it:
1. Compatibility with Android 4.3 (as of 1.47.11d).
2. Speech rate and pitch are correctly handled, so eSpeak on Android sounds like it does on the desktop.
3. Speech rate is set in words per minute.
4. Pitch, base pitch and volume are configurable as percentages.
5. The eSpeak variant can be selected, including the NVDA voice variants.
6. Punctuation level and punctuation characters are configurable.
7. Languages are correctly selected (e.g. selecting Slovak and Slovenian, or selecting Cantonese Chinese).
8. Language names are correctly displayed (e.g. "Scottish English" is displayed as "English (United Kingdom, Scottish Standard English)" instead of "English (Seychelles)").
9. Accented and special characters are supported.
10. MIPS-based devices are supported.
11. Various memory leaks and crashes are fixed.
12. SSML is only processed if the markup is wrapped in a <speak> tag.
13. Custom-built eSpeak dictionaries can be imported.
NOTE: When enabling eSpeak, you will get a message stating "This speech synthesis engine may be able to collect all the text that will be spoken, including personal data like passwords and credit card numbers. It comes from the eSpeak engine. Enable the use of this speech synthesis engine?". This is a standard warning issued by Android devices for accessibility software that comes from an external source.
The eSpeak engine does not collect any data in the text passed to it. All the source code to the core eSpeak engine and the Android port is available for you to verify this if you are concerned.
Any issues should be reported to https://github.com/rhdunn/espeak/issues.
The flags in the feature graphic are licensed under Creative Commons Attribution-ShareAlike by Wikipedia. The lips were designed by Jonathan Duddington. The application icon and feature graphic were designed by Reece H. Dunn.
The eSpeak source code is licensed under GPLv3+ by Jonathan Duddington.
The Android port is derived from the eyes-free port by Google under the Apache 2.0 license. Additional modifications have been made by Reece H. Dunn, also under the Apache 2.0 license.
The Unicode character handling (for correct handling of accented and non-Latin characters) is provided by the ucd-tools project by Reece H. Dunn, licensed under the GPLv3+ license. This project uses data tables generated from the Unicode Character Database (UCD) by http://unicode.org.
OpenCV Manager is needed. If it is not installed, the application will ask you to download it.
At least two faces must be saved before recognition can begin.
Training Mode: write the name of the person, focus on their face, and when a box locating a face begins to appear, press "Rec". Press "Rec" repeatedly to store different gestures.
Find Mode: focus on a face; if it is recognized, its name appears. An icon turns green, yellow or red depending on the degree of confidence in the recognition.
"View All" button: see the stored faces.
+ Version 2.2: Storage moved to internal storage.
Source code: https://github.com/ayuso2013/face-recognition
It is a very basic app that loads the last 50 tweets that contain the word Android when it starts. It does nothing else. It is meant as a learning aid for Android developers. All source code is available here: https://github.com/posco2k8/rest_loader_tutorial.
Dumbledroid is a framework that enables integration between an Android app and a RESTful server using magic. (Actually, it's not real magic. I'm kidding.)
Using Dumbledroid, the developer doesn't have to write parsers for JSON or XML documents from a web service. It maps the document nodes to the class fields and does this automagically!
In this app, you'll find working examples of simple JSON and XML integrations, as well as a Flickr picture search. All of them use Dumbledroid to connect with the web services.
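The node-to-field mapping idea can be illustrated with a tiny reflection sketch. This is not Dumbledroid's actual API: the NodeMapper and Photo names are hypothetical, and a plain Map stands in for a parsed JSON or XML document.

```java
import java.lang.reflect.Field;
import java.util.Map;

// Hypothetical illustration of mapping-by-field-name (NOT Dumbledroid's
// real API): a parsed document is represented by a plain Map, and each
// entry is copied onto the class field with the same name.
public class NodeMapper {

    public static <T> T map(Map<String, ?> nodes, Class<T> type) throws Exception {
        T instance = type.getDeclaredConstructor().newInstance();
        for (Field field : type.getDeclaredFields()) {
            Object value = nodes.get(field.getName());
            if (value != null) {
                field.setAccessible(true);
                field.set(instance, value);   // node name -> field name
            }
        }
        return instance;
    }

    // Example model class: field names mirror the document's node names.
    public static class Photo {
        public String title;
        public String url;
    }

    public static void main(String[] args) throws Exception {
        Photo p = map(Map.of("title", "Sunset", "url", "http://example.com/1.jpg"),
                      Photo.class);
        System.out.println(p.title + " -> " + p.url);
    }
}
```

The real library adds type conversion and nested-node handling on top of this basic idea; the sketch only shows why no hand-written parser is needed when field names match node names.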
Read more about the project on its Github page.
Logo credits: www.androidify.com
The app's source code is available, alongside the library it uses, at https://github.com/Fusion/CFRAnimated.
If you are not an application developer, you may find it hypnotizing but, other than that, of little interest to you.
Fixed a spelling error in the demo app's low-battery receiver.
Fixed some bugs in the Froyo version.
Fixed the "no provider available" error.
Track more information with analytics to evaluate the quality of the position the library delivers to the app.
Ask the user, via a dialog on the back button, whether the app succeeded in getting the position.
Its source code is available at https://github.com/pulsation/spektro-magnet, and collected data can be displayed using its companion web application, Spektro Hydra, available at https://github.com/pulsation/spektro-hydra.
=== What this is about ===
We have wanted to try REPL-based game development for a long time, and this is our attempt.
A REPL is like a command-line interface to the inside of your running program. It's like having a debugger running constantly, but perhaps less troublesome.
REPL-based development should allow us to develop games and apps much faster. Your new code takes effect immediately, and can be executed on your target hardware.
Bret Victor has a very nice talk about immediate feedback in his "Inventing on Principle" talk: http://vimeo.com/36579366
This project is an attempt to accomplish this, and so far it seems to have worked fairly well!
=== How we are doing this ===
The demo combines three third-party libraries:
- Chicken Scheme for the REPL
- Cocos2Dx for graphics
- Chipmunk for physics
Cocos2Dx is a C++ library designed with 2D games in mind. It lets you manage things like sprites, their animations and touch events. It is fast and portable (Android, iOS and others). Chipmunk is a physics engine written in C. It is fast, with a very nice API. Chicken Scheme is a Scheme-to-C compiler and interpreter.
All should also run on iOS, but I don't have a Mac.
=== Bindings ===
The Cocos2Dx bindings are at an early stage. Basic functions to manipulate sprites are available:
- (CCSprite::create "CloseNormal.png")
- (setPosition *sprite* x y)
- (getLocation touch-event)
The Chipmunk API is more mature. You can read about the `chickmunk` project on https://github.com/kristianlm/chickmunk.
=== Try it yourself ===
You can connect to the REPL directly from your laptop if your phone is on the same WiFi, or use USB. Try Settings -> Wireless Networks -> Wifi Settings -> [Menu] -> Advanced when looking for your phone's IP.
With netcat (or Emacs, with netcat [ip] [port] as your Scheme interpreter), you could try:
$ nc [phone ip] [port]
Alternatively, you could use USB with adb and forward:
$ adb forward tcp:1234 tcp:1234
$ nc localhost 1234
Once you see the REPL prompt @>, you can play around:
;; 'import' chipmunk bindings
(use chickmunk)
;; where is the player?
;; redefine game-loop to pause the game unless you're touching the screen
(define (game-loop)
  (if *touch-down* (space-step space (/ 1 120))))
;; now let's give the truck a gentle push
(body-set-ang-vel wf -20)
;; now touch the screen to watch it drift off
;; restart the app to revert your changes
;; You can also manipulate the physics-world:
;; Drop a ball from the sky
`(body ((pos (320 700)))
   (circle (density 0.001)))
;; Add a gentle but slippery slope (second endpoint is illustrative)
`(body ((static 1))
   (segment (friction 0.1)
     (endpoints ((250 500)
                 (600 450)))))
;; type this to see the touch-down state:
*touch-down*
;; it should be #f when your finger is off the screen, and touch coordinates otherwise. Evaluate it while holding the screen to try it out!
This is just a small example of what can be done. In fact, almost everything in the demo itself was developed this way, using the REPL from Emacs with an inferior Scheme process.
=== Source code ===
The source-code for the demo can be found on github: https://github.com/Adellica/cocoscheme.
Please give feedback, let us know if you like this! And please let us know of any similar efforts. Thanks!
The Bluesmirf Bluetooth module is available from Sparkfun.
Source code is available at https://github.com/jeffboody/bluesmirf-demo
Code for this application is available for free here: https://github.com/joemoore/C2DeMo
This is an alternative to HoloEverywhere. It has its pros and cons. The most important pro is that it uses the native Android View and Activity classes, so there is no need to import another package like "org.holoeverywhere.app...". The main disadvantage is that Dialogs and DialogFragments are still not supported.
If you need to use it please go to the github repo:
The application isn't fully optimised, but it may give some ideas on how to improve the implementation. For example, one might want to reduce unnecessary blur draw calls when nothing has changed in the view background. Also, DrawerLayout causes some headache when it draws its left/right child views :(
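That draw-call reduction could be sketched as a simple dirty-flag cache. This is a plain-Java sketch with hypothetical names, not the app's actual code: an int array stands in for the blurred Bitmap, and blur() is a placeholder for the real Stack Blur pass.

```java
// Hypothetical sketch (not the app's actual code) of skipping redundant
// blur passes: re-run the blur only when the background has changed.
public class BlurCache {
    private int[] cachedBlur;
    private boolean dirty = true;
    private int blurCount = 0;    // counts how many real blur passes ran

    // Call this whenever the view background actually changes.
    public void invalidateBackground() { dirty = true; }

    public int[] getBlurred(int[] background) {
        if (dirty || cachedBlur == null) {
            cachedBlur = blur(background);  // the expensive pass
            blurCount++;
            dirty = false;
        }
        return cachedBlur;                  // cheap cached reuse
    }

    private int[] blur(int[] src) {
        return src.clone();                 // placeholder for Stack Blur
    }

    public int getBlurCount() { return blurCount; }

    public static void main(String[] args) {
        BlurCache cache = new BlurCache();
        int[] bg = {1, 2, 3};
        cache.getBlurred(bg);
        cache.getBlurred(bg);               // no second blur pass runs here
        cache.invalidateBackground();
        cache.getBlurred(bg);               // background changed: blur again
        System.out.println("blur passes: " + cache.getBlurCount());
    }
}
```

In a real view, invalidateBackground() would be triggered by whatever signals a content change (e.g. a scroll or layout callback), so repeated frames over a static background cost only the cached lookup.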
You can find full source code on GitHub for further inspection:
- Stack Blur Algorithm by Mario Klingemann <email@example.com>
- Background image Nicolas Pomepuy https://github.com/PomepuyN/BlurEffectForAndroidDesign
- Application icon by someone on the Internet
- ScaleGestureDetector (android framework)
Gesture Library used:
Alternatively, it's a testbed for various techniques available in Android, put down in C# form.
This example was explained on my blog post: http://udinic.wordpress.com/2013/04/24/write-your-own-android-authenticator/
Full source code can be found here: https://github.com/Udinic/AccountAuthenticator
Having trouble with click volume or other settings? Press the menu button.
Free Software (GPL).
Note that the appearance of characters depends on your system font, which in turn depends on your Android version. Unavailable characters may appear as empty boxes, or not at all.