Final grades up on WolfWare!

Folks,

Final grades are up on WolfWare, and totaled there. The class did well.

Thanks again for a fun semester.

Let us know if you have any concerns!

We will post these tonight.

Ben.

Project: Arrival (Aagaman)



Team:
Chi-Han Wang, Vamsi Vikash Ankam, Pranesha Shashwath Kumar Kattepura Jayabheema Rao, Mahmoud Tohmaz, Abhishek Venkataraman, Jake Borland


Tagline: Home away from home.
Background:
Any new student at a university struggles to find a compatible roommate to live with during their college years. And when new students arrive at the local airport, they need someone to pick them up and take them to the university.


Our application is a solution to this problem. One feature allows students to fill out a form detailing their preferences for a roommate, such as dietary restrictions, gender preference, sleep/wake-up times, etc. Using our matching algorithm running in the backend, we then match students who share similar preferences. After creating an account and logging in, you will find a list of all the students seeking roommates at your university. Preferences can be set via the option at the top right corner. This allows students to find a roommate even before they arrive at the university. The people on the matching list can be contacted simply by clicking the send message button. This way, a new student can move into their new home as soon as they arrive.
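The writeup does not describe the matching algorithm itself, so purely as an illustrative sketch: one simple approach is to score pairs of students by the fraction of shared preference fields on which they agree. All field names below are hypothetical, not taken from the app.

```python
def preference_score(a, b):
    """Toy similarity between two students' preference dicts: the
    fraction of fields present in both on which the values agree.
    Returns a value in [0, 1]. The project's real algorithm is not
    described in the writeup; this only illustrates the idea."""
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    return sum(a[k] == b[k] for k in shared) / len(shared)

# Hypothetical preference forms:
alice = {"diet": "vegetarian", "gender": "female", "wake_time": "early"}
priya = {"diet": "vegetarian", "gender": "female", "wake_time": "late"}
print(preference_score(alice, priya))  # agree on 2 of 3 shared fields
```

Pairs could then be ranked by this score, with the highest-scoring candidates shown at the top of the matching list.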


Another feature is that, even before they arrive, students can find friends who are travelling on the same flight to the same university. The user just has to enter their flight details on the Add Flight Info page, which can be found in the drawer layout. Once entered, they can click Flight Updates in the drawer to see a list of people travelling on the same flight to the same university. Once they arrive, students can request that a fellow student or a student organization pick them up from the airport.


What we have accomplished:
We built a clean Android app following Google's material design principles, integrated the Android front end with a Bluemix backend server and a Cloudant database, wrote efficient REST APIs for communication between backend and front end, resolved the many integration issues that came our way, and collaborated to make the app work efficiently.


Published the app to the Google Play Store.


Screenshots:

Future work:
  1. Additional features could be added, such as creating groups for carpooling, grocery shopping, etc.
  2. The application could be integrated with the university's Office of International Services, so that whenever a student enters flight information, it is automatically emailed to the office, which can then arrange for someone from the university to pick the student up at the airport.







Project: Dino Digger VR

#DinoDigger


Team:
Keven Desai, Paritosh Desai, Ryu Komiyama, Benjamin Lykins, and Nikunj Shah

Tagline:
We are always looking for ways to interest young students in pursuing higher education. By targeting their interests, we hope to inspire their imagination by creating an interactive virtual space where they can interact with a dinosaur excavation site. The goal of this project is to be an educational experience where students can learn about the excavation process, and also be an authentic representation of the quarry and the bones.

What we have accomplished:
We managed to finish most of what we set out to do at the beginning of this project. We created a virtual reality application that runs on the Android platform and uses Google Cardboard to present a virtual dinosaur quarry. In the app, the user can collect bones and explore the quarry by moving to specific points around the environment. Five collectible bones are scattered around the environment. After the bones are collected, the user can go to the tent at the center of the map to complete the objective, and is then free to roam the environment for further exploration.

We also included educational information about the fossils in the environment, as that was part of the project requirements. The app presents facts about the bones as they are collected, as well as information about the process of excavating dinosaur bones and what it is like to work at an excavation site.



Future Possibilities:
  • HTC Vive (when the price lowers significantly from $800)
  • Camera function used to detect change in user coordinates outside of the VR environment 

Screenshots:


Link to Youtube video: https://www.youtube.com/watch?v=hc-hE3NATPA&feature=youtu.be

Link to Github repo: https://github.com/NCSUMobiles/Spring16-dinodigger

Project: TreeFinder


#TreeFinder



Team:
Abbas Hussain, Abbey Lancaster, Bobby Radford, Daniel Streeter, Priyanka Puranaik

Tagline:
Learning about trees will now be fun for kids.

Background:

You can always Google it, look it up on Wikipedia, or ask somebody for information about trees. But that is no fun, especially if we want children to learn about the trees that surround them. TreeFinder fills that gap by helping children navigate to the tree that most closely matches the physical qualities of a leaf they have found. The Android application leads them to the tree, with a tree picture and a small description, one step at a time, as they answer questions about the leaves.

The original design was to use the existing Tilt-a-Story components to make it game like.

Future Work:

1. Trivia and quizzes.
2. Better, consistent, and original graphics.
3. Picture-taking and data-gathering functionality.
4. Web services for images instead of permanent static links.
5. Extension to more trees.
6. Most importantly, interaction that makes it engaging for kids.
7. A similar app for iOS.


Representative Images:


Github: https://github.com/NCSUMobiles/Spring16-treefinder

Video: TreeFinder on YouTube

DinoRunner





Make running fun again!

Problem: For many gamers, running and physical activity are rarely something to get excited about. DinoRunner seeks to "gamify" running by combining it with RPG concepts, giving gamers goals to strive for, in a game that fits inside their pocket.

How it works: DinoRunner uses the Android phone's accelerometer to detect when the user is running, and maps the distance traveled onto the game interface. Players can pick different tracks, each with its own special twists. For example, some tracks might have regions of water that "slow" you down within the game, forcing the player to run faster in real life. Each track also spawns a track-specific monster that will chase you around. Be careful not to get caught: monsters are quite dangerous and will reduce the player's health. Upon successfully completing a track, players are rewarded with experience, items, and gold, which can be spent on in-game items that boost the player's speed (so they can tackle harder bosses!). With this reward system in place, we hope to make running an enjoyable experience, and hopefully make users forget they were running in the first place.
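The writeup does not show the detection code, but a common baseline for accelerometer-based run detection is counting upward crossings of an acceleration-magnitude threshold. This sketch, with an assumed threshold just above gravity, illustrates the idea; a real app would add filtering and a stride-length model to estimate distance.

```python
import math

def count_steps(samples, threshold=11.0):
    """Count steps in a stream of (ax, ay, az) accelerometer samples
    (m/s^2) by counting upward crossings of a magnitude threshold.
    A threshold slightly above gravity (~9.81 m/s^2) is a common
    starting point; production code adds smoothing and debouncing."""
    steps = 0
    above = False
    for ax, ay, az in samples:
        mag = math.sqrt(ax * ax + ay * ay + az * az)
        if mag > threshold and not above:
            steps += 1
            above = True
        elif mag <= threshold:
            above = False
    return steps

# Two simulated impact peaks -> two steps:
run = [(0, 0, 9.8), (0, 0, 13.0), (0, 0, 9.8), (0, 0, 13.0), (0, 0, 9.8)]
print(count_steps(run))  # 2
```

Distance could then be estimated as steps multiplied by an average stride length and fed into the in-game position.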

Unfinished work: A few of the team members are planning on continuing this project, hopefully towards a summer blockbuster release. Before release, we wish to have:

  • MORE TRACKS
  • MORE MONSTERS
  • MORE ITEMS
  • Immersive sound
  • In-game shop
  • ...AND MORE


PROJECT : Tilt-a-Story


Team Members
Ayush Gupta, Nishtha Garg, Shifali Jain, Sagar Manohar, Sreekanth Ramakrishnan.

Background:

The children's book "Press Here" describes a charming 2D world that the reader can "interact" with by tilting and poking the book. But what if it were real? This project will explore this possibility in a mobile app. We worked on building an API using Unity3d and Playmaker which can be used by designers to build different mobile games and levels on top of it.  

Milestones achieved

  • Understood the existing code and identified the refactorable components.
  • Built menu development APIs.
  • Built environment development APIs.
  • Built character development APIs.
  • Built movement APIs that use the accelerometer.
  • Built interaction APIs.
  • Built sound integration APIs.
  • Recreated the old Tilt-a-Story game and built new levels using the APIs.



Users can customize the menu, environment, character, movement, interaction, and sound according to the game's requirements using our custom scripts, following the simple steps explained in our designer help document. We have created separate scripts for all of the parts mentioned above.

Future Work
  • Support custom settings
  • Support more interactions in the game
  • Support 3D games
  • Add more visual effects and animations
Video : https://www.youtube.com/watch?v=-0bm3KrwN8E

GitHub : https://github.com/NCSUMobiles/Spring16-tiltastory

Documentation : https://github.com/NCSUMobiles/Spring16-tiltastory/blob/master/TiltAStory_Documentation.pdf

Presentation : https://github.com/NCSUMobiles/Spring16-tiltastory/blob/master/TiltAStory_presentation.pptx.pdf


Post invites out, extra reading spots up

Folks,

All teams should have invites to post at this time. Let us know if not!

I have added space for extra readings in your grading index.

Please continue to let us know if you have any concerns.

Thanks again for a great semester!

Prof Watson

Project: SquarePeg



Team

Andrew Rather, Youngwoong Lee, Vishal Mishra, Yuang Ni

#SquarePeg

Tagline

Bringing old fashioned video up to speed in a 3D world.

Background:

There is currently no application for viewing a mixture of 360-degree video and traditional flat video in a single viewing experience. Such an application would allow for a much better, more practical viewing experience. Traditional video is better at presenting information, while 360 video is better at creating an immersive experience; mixing the two styles allows for a wide range of informative and immersive experiences.

Our application is designed to handle this within a web browser and for use with Google Cardboard.  The application allows standard motion control when you are viewing the 360 video but then suspends the controls for traditional video.

Future Work

We hope to add an interface for selecting videos. A smoother transition between the two video types would also be an improvement.

Video

https://youtu.be/j9WricenHlg

Repository

https://github.com/NCSUMobiles/Spring16-squarepeg


Project: VR Timeline



TEAM

Colleen Hutson, Ronak Nisher, Juhi Desai, Akash Agrawal, Hongyi Ma, Kamaria Hardy

TAGLINE

A virtual reality timeline experience that allows a user to go beyond the event.

BACKGROUND

The open source tool Timeline JS allows a user to create a timeline about anything.  Using a Google Sheets template the user provides the tool with the dates, event titles, and pictures.  Timeline JS then creates a clean, organized, functional timeline.  Our challenge was to take this two-dimensional timeline and bring it to life in a virtual reality environment.  We were tasked with implementing an intuitive experience that would provide the user with more information and images than the desktop Timeline JS version.

KEY FEATURES

Upon launching the application the user becomes immersed in a new learning experience.
  • Gaze-Triggered Buttons: all buttons within the application are triggered by the user's gaze. The user must stare at a button for approximately 5 seconds to "click" it.
  • Help Screen: from the launch screen the user may select a help screen with instructions on how to interact with the timeline.
  • Easy Scrolling: the user scrolls through the timeline by simply looking to the left or right. The further the user looks to the left or right, the quicker they scroll through the timeline.
  • Event Buttons: each timeline event is represented by a button; to learn more about a particular event, the user triggers it with their gaze.
  • Information Panels: upon selecting an event button, panels above and below the timeline are populated for that event.
    • Some Additional Information: above the timeline, the user is given a brief description of the event and a single image.
    • More Additional Information: below the timeline, the user is given additional background on the event and/or extra images.
  • More Timelines: the user can switch to different timelines.
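The gaze-triggered buttons described above amount to a dwell timer. The project itself presumably implements this inside its VR engine; this standalone Python sketch only illustrates the logic, firing once the gaze has rested on a button for the stated ~5 seconds.

```python
class GazeButton:
    """Dwell-to-click button: fires after the gaze has rested on it
    for `dwell_s` seconds. Time is passed in explicitly so the logic
    is easy to test; a real app would use the frame clock."""

    def __init__(self, dwell_s=5.0):
        self.dwell_s = dwell_s
        self.gaze_start = None  # time the current gaze began, or None

    def update(self, is_gazed, now_s):
        """Call once per frame; returns True on the frame the
        'click' fires."""
        if not is_gazed:
            self.gaze_start = None  # gaze left the button: reset
            return False
        if self.gaze_start is None:
            self.gaze_start = now_s  # gaze just arrived
        if now_s - self.gaze_start >= self.dwell_s:
            self.gaze_start = None  # re-arm after firing
            return True
        return False

button = GazeButton(dwell_s=5.0)
print(button.update(True, 0.0))  # False: dwell just started
print(button.update(True, 5.0))  # True: 5 s of continuous gaze
```

Looking away at any point resets the timer, which is what keeps accidental glances from triggering events.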

FUTURE WORK

  • Currently each timeline must be created individually. If we could incorporate the Timeline JS Google Sheets template into the application, we could create additional timelines more efficiently and possibly allow each user to individualize which timelines he/she can view.
  • In the world of design, a world event or economic crisis can have an impact on the design and construction of products. If we could identify the implications such an event or crisis had on production, we could connect the two timelines and allow the user to see how current events influenced product development.

REPOSITORY
https://github.com/RonakNisher/VR-Timeline/tree/Centered_Info

VIDEO

https://youtu.be/q3C9xC8PBwQ

Project: Story Maps

StoryMaps Final Project

#projects

Tagline: Every place has a story to tell.

Description:

Do you miss the old tourist maps that gave you curated content, specialized for your needs? Today, Google Maps dominates when it comes to travel. But when you need to explore a new area, where do you start? We don't have many options today to address this need. Thus, we present to you: StoryMaps.

The StoryMaps application allows you to access curated stories for Raleigh. Every story or map trail is based on a different theme. You can select any story that you like and follow the trail of locations provided to start exploring the new place! Each story provides a custom map and distances of all places from your current location.

Future Work:

Adding search functionality, to search within the list of stories.
Providing directions between locations.
Adding different verticals, for example, stories with global appeal.


Video: https://www.youtube.com/watch?v=fJirEhxwehM

Repository: https://github.com/NCSUMobiles/Spring16-storymaps

AppStore: https://play.google.com/store/apps/details?id=com.ionicframework.storymaps565921

Representative Image:

Recognize Final Project

#projects

Project: Recognize

Recognize is an image-based quiz game that tests your ability to perform different mental operations on a main image to match it with one of four image choices as quickly as possible!

Recognize aims to create an innovative visual recognition game that targets school-aged children.

The application is Android-based and requires players to identify which of the image choices is somehow related to the main image, which is filtered with some effect (e.g., blur, vertical scanner, etc.). The game's content (images and fun facts) is remotely managed via Google App Engine.

Future requirements include integrating authentication into the server side, creating easy/medium/hard levels, and letting users upload their own images and albums into the app.

Website: https://recognize-1210.appspot.com

Repository: https://github.com/NCSUMobiles/Spring16-recogneyes

Video: https://drive.google.com/file/d/0B7QAYFBHMrFhbGVzRHpBTEtfR28/view?usp=sharing


Apologies! Ignore visual experience note

Apologies folks!

Ignore the previous message about visual experience.

Posted on the wrong course!

Prof Watson

Reminder: touch and maps questions

Folks,

You might want to do our touch and maps questions to prepare for the corresponding quizzes during our last Monday class.

Professor Watson

Find: EU lodges formal antitrust complaint against Android

EU lodges formal antitrust complaint against Android
// The Verge - All Posts

The European Union has notified Google of formal antitrust charges against the company relating to its Android mobile operating system. The charge sheet focuses on the company's prioritizing of its own services on Android devices, including practices that mean that Google Search is "pre-installed and set as the default, or exclusive, search service on most Android devices sold in Europe." The EU's investigation, which was originally opened last April, claims that this and other measures prevent companies from effectively competing with Android.

Continue reading…

Find: Intel cuts 12,000 jobs in wake of falling PC sales

Mobile takes a bigger bite out of Intel.

---- 
Intel cuts 12,000 jobs in wake of falling PC sales
// The Verge - All Posts

Intel is laying off 12,000 employees globally, or about 11 percent of its workforce, the company said in a statement today.

Continue reading…

Team evals

Folks,

As we approach this semester's crescendo, please make sure that you perform for your team.

For those of you who feel one or more of your team members are not performing, you can ensure that failure to perform is reflected in their course mark by filling out this form as a team. Your feedback about your team members is anonymous.

Prof. Watson

Next #appfight in @NCState's @mobiclass: Twitter vs Facebook

Don't forget to post your thoughts about this week's social app fight! Remember to focus on comparing basic interactions, not feature differences.

Dual-camera phones are the future of mobile photography

Dual-camera phones are the future of mobile photography
// The Verge - All Posts

Smartphone cameras can do astounding things nowadays, however they are starting to reach some hard physical limits. There's only so much you can achieve within the tight constraints of a device that's 7mm thick, and phone companies are looking for alternative means to keep improving. This spring, LG and Huawei have led the way with their new Android flagships, introducing two very different dual-camera systems that nevertheless signal the direction that the entire industry is about to head in. Apple's iPhone 7 Plus is rumored to be following their lead later this year. One day soon, we'll look at dual-camera phones the way we think of dual-core devices today: just a logical progression with nothing remarkable about it.

LG's G5 will be...

Continue reading…

People interpret the same emoji in completely different ways

People interpret the same emoji in completely different ways
// The Verge - All Posts

Although emoji fill in where words are lacking, their true meaning might be getting lost in translation. People interpret emoji differently, according to a new study from the University of Minnesota, Twin Cities. This holds true even if chatting is happening over the same operating system.

Take the "grinning face with smiling eyes" emoji, , for example.  The researchers surveyed online respondents on how they interpreted the emoji's sentiment, rating it on a scale from -5 (strongly negative) to +5 (strongly positive). Seventy percent of people put the face in the negative area of the chart, with most people ranking it at -3. But still, 27 percent of respondents thought it conveyed a more positive emotion. The chart below breaks down the...

Continue reading…

Assignment: projects are up! Please indicate your preferences at catme.org

Folks,

Our projects are set! Please indicate your preferences for them, along with other team making info, at www.catme.org. Do so ASAP!

Professor Watson

Next #appfight in @NCState's @mobiclass: gMessenger vs Facebook Messenger

Don't forget to post your thoughts about next week's messaging app fight! Remember to focus on comparing basic interactions, not feature differences.

No class today!

The university has cancelled all classes starting before noon.

See you Wednesday!

Professor Watson.

#goodquestion from @NCState's @mobiclass: how do we control text size on mobiles?

Hi folks,

The question came up in class recently: how can we control font size and maintain text legibility across the wide range of mobile devices? Is a font point 1/72 of a physical inch, as it would be on paper?

Short answer: Neither physical length nor pixel count will reliably size text across mobile devices. Legible length depends on viewing distance (see below), while the legible number of pixels depends on pixel density. Both vary across devices. But no matter which device displays text, the human eye always views it. According to both the FAA and the FDA, legible text covers roughly 1/4 degree of visual angle. iOS, Android, and web apps all use some sort of device-independent unit to control the angular size of text. On iOS, this unit is the point (pt). On Android, it is the "density-independent pixel" (dp). On the mobile web, it is the "CSS pixel" (px). None of these units has a fixed physical length, in contrast to the traditional typographical point.

How legible length depends on viewing distance (W3C)

iOS 

Responsive legibility is a simpler problem on iOS than other platforms, because the number of different iOS devices is relatively small, and because Apple controls the full technology stack (both OS and device).

The device-independent unit in iOS is the point (pt). A point can span one, two, or three pixels:
  • non-Retina devices: 1 pt = 1 pixel; at 163 dpi, 12 pt = 1/4 deg
  • Retina devices: 1 pt = 2 pixels; at 326 dpi, 12 pt = 1/4 deg
  • Retina HD devices: 1 pt = 3 pixels; at 489 dpi (scaled to 401 dpi), 12 pt = 1/4 deg
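These numbers can be checked directly. An iOS point is 1/163 inch on all of the devices above (the pixel counts differ, but the physical point size stays the same), so the visual angle of 12 pt follows from an assumed viewing distance. The ~17-inch distance below is an assumption, chosen as a typical handheld distance, not a figure from Apple.

```python
import math

def visual_angle_deg(size_in, distance_in):
    """Visual angle in degrees subtended by an object of physical
    size `size_in` inches viewed from `distance_in` inches away."""
    return math.degrees(2 * math.atan(size_in / (2 * distance_in)))

size_12pt = 12 / 163  # an iOS point is 1/163 inch
angle = visual_angle_deg(size_12pt, 17.0)  # assumed viewing distance
print(f"{angle:.2f} deg")  # lands near the 1/4 deg legibility target
```

The same function applies to the Android and web figures later in this post, with the appropriate physical sizes and distances substituted.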

Android

Things are more complex for Android apps, which must function on an extremely wide variety of devices, most of which Google does not control. Android assumes that these devices will be viewed at roughly the same distance, and adjusts text height based purely on pixel density.

The device-independent unit in Android is the density-independent pixel (dp), which, like the iOS point, is sized relative to a 160 dpi pixel:
  • Kindle Fire: 1 dp = 1 pixel; at 160 dpi, 12 dp = 1/4 deg
  • Nexus S: 1 dp = ~1.5 pixels; at 233 dpi, 12 dp = 1/4 deg
  • Nexus 6: 1 dp = ~3 pixels; at 493 dpi, 12 dp = 1/4 deg
Note that to avoid aliasing and ease development, Android specifies that dp-to-pixel ratios be rounded to certain "buckets"; text would look bad if a dp were some odd fraction of device pixels. Note further that Google recommends specifying text size using scalable pixels (sp) so that users can adjust text size to their preference.

Mobile Web

Maintaining legibility is most complex on the web, since the variety of devices includes not only mobiles but also desktops and laptops. In particular, drastic differences in device size create large differences in comfortable viewing distance, meaning that text size cannot be controlled based on pixel density alone.

As Google points out on its search ranking pages (where legibility is a ranking factor), the device-independent unit on the web is the "CSS pixel" (px). According to the W3C CSS2 standard:
The reference pixel is the visual angle of one pixel on a device with a pixel density of 96 dpi and a distance from the reader of an arm's length.
Surprise! 12 px is 1/4 degree. Likely the variety of devices the web must support led the W3C to adopt a device-independent unit based on visual angle rather than pixels, since this makes no assumptions about viewing distance. Mozilla illustrates how a CSS pixel translates to different physical lengths on different devices:
  • desktop screen viewed at 28": px = 1/4 mm
  • laptop screen viewed at 22": px = 1/5 mm
  • phone screen viewed at 16": px = 1/6 mm
Don't forget that a CSS pixel is not a device pixel. The number of device pixels per CSS pixel is called the device pixel ratio (DPPX). This is set by manufacturers, who per the list above must consider comfortable viewing distance when doing so. Again, DPPX is typically a round number, to avoid aliasing.
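The W3C's angular definition turns into simple arithmetic: one reference pixel subtends the angle of 1/96 inch seen from 28 inches, so its physical size scales linearly with viewing distance. This small-angle sketch reproduces Mozilla's numbers above.

```python
def css_px_mm(viewing_distance_in):
    """Physical length (mm) of one CSS reference pixel at a given
    viewing distance, per the W3C definition: the visual angle of
    1/96 inch at 28 inches (arm's length). Small-angle approximation."""
    angle_rad = (1 / 96) / 28  # radians subtended by the reference pixel
    return viewing_distance_in * angle_rad * 25.4  # inches -> mm

for d in (28, 22, 16):
    print(f'{d}" viewing distance -> {css_px_mm(d):.2f} mm')
# ~0.26 mm (1/4 mm), ~0.21 mm (1/5 mm), ~0.15 mm (1/6 mm)
```

In other words, a phone held at 16 inches needs physically smaller pixels than a desktop at 28 inches to present the same angular size, which is why manufacturers pick DPPX with viewing distance in mind.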

Closing Thoughts

Maintaining legibility on mobiles is much more complex than it is on desktops and laptops, because of the great variety in mobile devices. But this development complexity is the new reality: small mobiles are outselling PCs roughly 8 to 1; while large displays are becoming smarter, cheaper and bigger all the time. In the age of the heterogeneous and ubiquitous display, the days of counting device pixels during development are over, and have been for some time.

Note: submit your grading index often if you want up to date feedback

Hey folks,

Submit your grading index often so that we can see what you've done promptly.

There is no need to save up URLs to your online work outside of the index. You can always come back and edit your grading index to add more URLs after you submit.

Prof Watson

Next #appfight in @NCState's @mobiclass: GKeyboard vs Fleksy


Don't forget to post your thoughts about next week's keyboards app fight!

Next #appfight in @NCState's @mobiclass: Chess Free vs lichess


Don't forget to post your thoughts about tomorrow's games app fight today!

Next #appfight in @NCState's @mobiclass: gCamera vs Z Camera

Don't forget to post your thoughts about tomorrow's camera app fight today!

App Fight Posts due today!

Folks,

Don't forget to post your first app fight today! In this week's app fight, two dialers face off.

Tomorrow, students Borland, Chiavegatto, Agrawal, and Ankam will lead discussion of the app fight. Thanks for going first, folks!

You can learn more about good app fight posts and presentations here.

Professor Watson

Announcement: make sure you fill out our exercise

Folks,

If you haven't already, please make sure you fill out the exercise form for today's activity on our notes page, as best you can.

Professor Watson