Friday, December 27, 2013

End of the year post

Hi guys, long time no see. I usually don't write this kind of post, since this is more of a "programming spot for nerds" than an actual blog about me. Anyway, I wanted to write something about myself for once: this year is about to end, and it brought some major changes in my life. I opened this blog because I was bored and wanted to dedicate some of my time to helping others with the only useful thing I can bring to the web: my passion for programming, video games and whatever else. However, in the process of studying and working I lost all my interest in the things that made me happy. That, and realizing a bunch of things about myself, made me sad and insecure; I lost the sense of who I am. Hopefully things will change. I've never made a resolution list, but I want to make one, not only because of the new year crap, but also to help me become the person I wanted to be in the first place. So, here we go with the new year's resolutions:

  1. Dedicate more time to my passions (including making music, playing videogames, drawing).
  2. Start thinking and making new apps and games, even if they sound stupid or awful.
  3. Stop working and studying passively; try to find the fun in both.
  4. Improve myself as a programmer. I still have a lot of things I want to learn in general; I'll find the time to experiment with PHP, JavaScript, jQuery, Tomcat and more.
  5. Be more open with friends and myself.
  6. Post more useful or fun things on the blog.
  7. Worry less about money; I'll handle financial problems one step at a time.
  8. Be happy.
An empty smile can't be as warm as a real one. I'll try my best.


Sunday, December 1, 2013

Communication between Scripts > Unity3d

Mainly for my fellow Italian friends, below I'm reposting part of my thesis, which discusses briefly but concisely one of the fundamental concepts of Unity3d: communication between the scripts attached to the various gameObjects in the scene. Any script "attached" to an object in the scene is treated as a component; when you create a new script, it immediately becomes a new "type" identified by the script's name. We can access the public functions and properties of a component of a given object simply by calling GetComponent. To avoid going through the component you can use the SendMessage function, available on the gameObject: it looks up and executes the function whose name is passed as input (belonging to any script attached to that object). In the example we see how to find an object in the scene from code and then call one of its functions.
function Start()
{
    var controlCenter : GameObject; // declare a variable of type GameObject
    controlCenter = GameObject.Find("ControlCenter"); // returns the ControlCenter object from the scene
    controlCenter.GetComponent(MusicCenter).toogle_music = true; // change a property of the MusicCenter script
    
    controlCenter.SendMessage("set_music", true); 
    // call the set_music function on any script attached to controlCenter
    // set_music takes a boolean as input, passed as the second parameter of SendMessage
}

Through the Find and FindGameObjectsWithTag functions of the GameObject class you can obtain references to the objects in the scene. Every property we access through GetComponent must be public or internal.

Friday, November 29, 2013

Using the GUI, keeping the proportions > Unity3D

Unity lets you define the look of a specific GUI element through objects called GUIStyle, which can be defined in code or through the Inspector. Their properties include the images assigned to each state of the element (Normal, Hover, Active), the font used for the text, borders, margins, padding, and so on.
Everything that has to be displayed in the Graphical User Interface must be defined in code inside the OnGUI function; events such as a button press are also handled starting from OnGUI.
Position and size of the elements are handled through a variable type called Rect; the values are relative to the application window and have their origin in its top-left corner. It is not unusual for an application to run on screens with different resolutions, especially on Android devices, where every phone has a different screen size. With plain Rect values the result is elements whose size or position changes depending on the screen resolution.
To keep the GUI unchanged you can work with proportions, starting from two fixed width and height values. In the example we see how to apply a transformation to the GUI matrix so that the interface stays the same at every resolution:
private var originalWidth = 480.0;  // resolution chosen when authoring
private var originalHeight = 800.0; // the GUI contents
private var scala : Vector3; 
function OnGUI() { 
    scala.x = Screen.width / originalWidth;   // horizontal ratio
    scala.y = Screen.height / originalHeight; // vertical ratio
    scala.z = 1;
    var svMat = GUI.matrix; // save the current matrix before replacing it
    // replace the matrix: only the proportions stay the same as the original
    GUI.matrix = Matrix4x4.TRS(Vector3.zero, Quaternion.identity, scala);
    /////////////////////////////////////////////// 
    // GUI definition goes here
    /////////////////////////////////////////////// 
    GUI.matrix = svMat; // restore the original matrix
}
Once the scale matrix is defined, we can start defining the elements of our GUI.

Sunday, November 17, 2013

Kagerou Project Outer Science Update

Yeah, I spent all day working on this; since the game is released on Windows only, for now I've added more extras to the level. But maybe I overdid it: I can't get an S on this level, even though it's my own game. I'll leave you the download link, and I've also updated the download page.
At the beginning the level is locked; you unlock it once you have at least one S (in hard mode, on any level).

I would like to see some gameplay videos of this on YouTube! Let me know if someone makes any!

Friday, November 8, 2013

Outer Science Update - Kagerou Project Game

Hi guys, if you are still interested in the game: I think I'll have some free time to work on my projects in the next few weeks. As I've already said, I plan to add Outer Science and Sunset Yesterday to the game, so which of the following versions of the song do you prefer?
JubyPhonic Cover
Whoever it is cover
Awesome cover I don't know who made it
Len Arranged version

Let me know in the comments; I'll decide which version to put in the game.


Wednesday, October 30, 2013

Vector3, moving an object > Unity3D


Understanding how vectors work is fundamental for every good game programmer. Every game object in the scene has the Transform component by default; inside it, three different Vector3 objects are accessible to define the object's position, scale and rotation in space.
function Start () {
    var pos : Vector3 = Vector3(5,5,5);
    pos.x += 15;
    var dim : Vector3 = Vector3(2,2,2);
    var rot : Vector3 = Vector3(90,0,0);
    // assign the values to my gameobject
    transform.position = pos;
    transform.localScale = dim;
    // convert from Euler angles to a Quaternion
    transform.rotation = Quaternion.Euler(rot);
}
Vector3 is the object used to instantiate a new vector. It can be used not only to define a position but also a displacement, since it carries information about the direction, orientation and magnitude of the movement. The same reasoning applies to the rotation and the scale of the game object: in general, a vector can be used to build simple transformations on them. In the example we see how to move a scene object in a given direction, remembering to use the deltaTime variable of the Time class, which keeps the movement uniform regardless of how fast the machine runs.
function Update () {
    muovi(Vector3.up, 0.5);
}
function muovi(dir : Vector3, speed : float)
{
    //use only the direction of the vector
    //and multiply it by my own intensity (speed)
    dir = dir.normalized * speed;
    transform.position += dir * Time.deltaTime;
}


Mathematically, given two points you can compute a direction simply by subtracting one from the other: the resulting vector lies on the line passing through the two points, it points from the subtracted point towards the other one, and its magnitude is proportional to the distance between the two points.

----------------------------------------------------

Taken directly from my thesis: very basic reasoning, useful for anyone approaching Unity3D for the first time. For more details see this link.
For English readers, just have a look at this page: http://docs.unity3d.com/Documentation/Manual/DirectionDistanceFromOneObjectToAnother.html

Saturday, October 19, 2013

Cutting video with Xuggler > Java

In case someone needs this, here is how I managed to cut a video with Xuggler. I don't want to waste time explaining how it works, since I recommend reading my other blog post about FFmpeg; I hope the code alone can help. The Cutter class needs the path to the video file and two arrays with the start and end points of each cut you want to make.
package it.xuggler.demo;

import java.awt.image.BufferedImage;
import java.io.File;

import com.xuggle.mediatool.IMediaReader;
import com.xuggle.mediatool.IMediaWriter;
import com.xuggle.mediatool.MediaListenerAdapter;
import com.xuggle.mediatool.MediaToolAdapter;
import com.xuggle.mediatool.ToolFactory;
import com.xuggle.mediatool.event.IAudioSamplesEvent;
import com.xuggle.mediatool.event.IReadPacketEvent;
import com.xuggle.mediatool.event.IVideoPictureEvent;
import com.xuggle.mediatool.event.IWriteHeaderEvent;
import com.xuggle.mediatool.event.IWritePacketEvent;
import com.xuggle.mediatool.event.IWriteTrailerEvent;
import com.xuggle.xuggler.Global;

public class Cutter extends MediaListenerAdapter {
 
 //public static double START_POINT = 10.0 * Global.DEFAULT_PTS_PER_SECOND;
 //public static double END_POINT = 25.0 * Global.DEFAULT_PTS_PER_SECOND;
 private String TMP_DIR;
 
 public static void main(String[] args)
 {
  //--> two arrays: cut start and end points (in seconds)
  double s[] = new double[3];
  s[0] = 5;
  s[1] = 20;
  s[2] = 46;
  double e[] = new double[3];
  e[0] = 10;
  e[1] = 35;
  e[2] = 55;
  new Cutter(s,e,args[0],"out");
 }
 
 //I need several writers, one for each part of the video to cut out
 private IMediaWriter writers[];
 
 public Cutter(double[] starts, double[] ends, String videoPathin, String videoPathout)
 {
  writers = new IMediaWriter[starts.length];
  IMediaReader reader = ToolFactory.makeReader(videoPathin);
  reader.setBufferedImageTypeToGenerate(BufferedImage.TYPE_3BYTE_BGR);

  TMP_DIR = videoPathout + "_tmp";
  File tmpdir = new File(TMP_DIR);
  tmpdir.mkdir(); //create a temporary folder to store the fragments
  //convert from seconds to timestamp units (microseconds)
  for(int i = 0; i < starts.length; i++)
  {
   starts[i]*=Global.DEFAULT_PTS_PER_SECOND;
   ends[i]*=Global.DEFAULT_PTS_PER_SECOND;
   writers[i] =  ToolFactory.makeWriter(TMP_DIR+"/p"+i+".flv", reader); //the path includes the fragment file name
  }
  
  //create a tool that cuts the video at the chosen points
  videoCheck checkPos = new videoCheck(); //videoCheck extends MediaToolAdapter
  reader.addListener(checkPos);
  //IMediaWriter writer = ToolFactory.makeWriter(videoPathout+".flv", reader); //the path includes the output file name
  
  boolean updatedS = false;
  boolean updatedE = false;
  
  int rp = 0; //Relative Position: index of the current cut, advancing as the parts of the video go by
  //it is important that the cut intervals are disjoint and in increasing order
  while(reader.readPacket() == null)
  {
   if(!updatedS && (checkPos.timeInMilisec >= starts[rp]))
   {
    System.out.print("\n" + rp);
     updatedS = true; //from the start point on, begin converting
    updatedE = false;
    checkPos.addListener(writers[rp]);
   }
   
   if(!updatedE && (checkPos.timeInMilisec >= ends[rp] ))
   {
    System.out.print("-" + rp);
     updatedE = true; //once the end point is reached, stop converting
    checkPos.removeListener(writers[rp]);
    writers[rp].close();
     rp++; //move on to the next part of the video
    if(rp == starts.length)
     { //if we have reached the last cut
      System.out.print("\nCLOSE\n");
      //writer.close(); //stop converting
    }
    else
     updatedS = false;
   }
  }
  
  String OUT_FILE = videoPathout+".flv";
  //once the separate fragment files are produced, join them into a single file
  concatenateVideoFromWriters(OUT_FILE);
  
}

public void concatenateVideoFromWriters(String OUT_FILE)
{
 //if there is only one fragment, just move and rename it
 if(writers.length == 1)
 {
  new File(writers[0].getUrl()).renameTo(new File(OUT_FILE));
 }
 else
 {
  //concatenate the first two fragments, then continue from the third
  new MyConcatenateAudioAndVideo().concatenate(writers[0].getUrl(),writers[1].getUrl(),TMP_DIR+"/d1.flv");
  int i;
  for(i = 2; i < writers.length; i++)
   new MyConcatenateAudioAndVideo().concatenate(TMP_DIR+"/d"+(i-1)+".flv",writers[i].getUrl(),TMP_DIR+"/d"+i+".flv");
  //move the final file to the requested output path
  new File(TMP_DIR+"/d"+(i-1)+".flv").renameTo(new File(OUT_FILE));
 }
 //delete all the temporary files
 deleteAllFromTmpFolder();
}

public void deleteAllFromTmpFolder()
{
 System.out.print("deleting tmpfile and folder"+ TMP_DIR +" \n");
 File td = new File(TMP_DIR);
 String[] fileslist = td.list();
 for(String fpath : fileslist)
 {
  System.out.print(fpath+" \n");
  new File(TMP_DIR+"/"+fpath).delete(); //delete the files inside the folder
 }
 td.delete(); //delete the folder itself
}

 class videoCheck extends MediaToolAdapter
{
 //time stamp of the last frame seen, in microseconds (despite the field name)
  public Long timeInMilisec = (long) 0;
  public boolean convert = true;
  
  @Override
    public void onVideoPicture(IVideoPictureEvent event)
  {
   timeInMilisec = event.getTimeStamp();  //returns the exact instant in MICROseconds
    //now call the superclass, which carries on with the processing
 
    if(convert)
   super.onVideoPicture(event);
  }
  
  @Override
   public void onAudioSamples(IAudioSamplesEvent event)
  {
   if(convert)
    super.onAudioSamples(event);
  }
  
  @Override
   public void onWritePacket(IWritePacketEvent event)
  {
   if(convert)
    super.onWritePacket(event);
  }
  
  @Override
   public void onWriteTrailer(IWriteTrailerEvent event)
  {
   if(convert)
    super.onWriteTrailer(event);
  }
  
  @Override
   public void onReadPacket(IReadPacketEvent event)
  {
   if(convert)
    super.onReadPacket(event);
  }
  
  @Override
   public void onWriteHeader(IWriteHeaderEvent event)
  {
   if(convert)
    super.onWriteHeader(event);
  }
}
 
}

Here is the MyConcatenateAudioAndVideo class:
package it.xuggler.demo;

/*
 * Copyright (c) 2008, 2009 by Xuggle Incorporated.  All rights reserved.
 * 
 * This file is part of Xuggler.
 * 
 * You can redistribute Xuggler and/or modify it under the terms of the GNU
 * Affero General Public License as published by the Free Software
 * Foundation, either version 3 of the License, or (at your option) any
 * later version.
 * 
 * Xuggler is distributed in the hope that it will be useful, but WITHOUT
 * ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
 * FITNESS FOR A PARTICULAR PURPOSE.  See the GNU Affero General Public
 * License for more details.
 * 
 * You should have received a copy of the GNU Affero General Public License
 * along with Xuggler.  If not, see <http://www.gnu.org/licenses/>.
 */


import java.io.File;

import com.xuggle.mediatool.IMediaReader;
import com.xuggle.mediatool.IMediaViewer;
import com.xuggle.mediatool.IMediaWriter;
import com.xuggle.mediatool.MediaToolAdapter;
import com.xuggle.mediatool.ToolFactory;
import com.xuggle.mediatool.event.AudioSamplesEvent;
import com.xuggle.mediatool.event.IAddStreamEvent;
import com.xuggle.mediatool.event.IAudioSamplesEvent;
import com.xuggle.mediatool.event.ICloseCoderEvent;
import com.xuggle.mediatool.event.ICloseEvent;
import com.xuggle.mediatool.event.IOpenCoderEvent;
import com.xuggle.mediatool.event.IOpenEvent;
import com.xuggle.mediatool.event.IVideoPictureEvent;
import com.xuggle.mediatool.event.VideoPictureEvent;
import com.xuggle.xuggler.IAudioSamples;
import com.xuggle.xuggler.IVideoPicture;

import static java.lang.System.out;
import static java.lang.System.exit;

/** 
 * A very simple media transcoder which uses {@link IMediaReader}, {@link
 * IMediaWriter} and {@link IMediaViewer}.
 */

public class MyConcatenateAudioAndVideo
{
  /**
   * Concatenate two files.
   * 
   * @param args 3 strings; an input file 1, input file 2, and an output file.
   */
  
  public static void main(String[] args)
  {
    if (args.length < 3)
    {
      out.println("Concatenate two files.  The destination " +
        "format will be guessed from the file extension.");
      out.println("");
      out.println("   MyConcatenateAudioAndVideo <source-file-1> <source-file-2> <output-file>");
      out.println("");
      out.println(
        "The destination type will be guessed from the supplied file extension.");
      exit(0);
    }

    File source1 = new File(args[0]);
    File source2 = new File(args[1]);
    
    if (!source1.exists())
    {
      out.println("Source file does not exist: " + source1);
      exit(0);
    }

    if (!source2.exists())
    {
      out.println("Source file does not exist: " + source2);
      exit(0);
    }

    concatenate(args[0], args[1], args[2]);
  }

  /**
   * Concatenate two source files into one destination file.
   * 
   * @param sourceUrl1 the file which will appear first in the output
   * @param sourceUrl2 the file which will appear second in the output
   * @param destinationUrl the file which will be produced
   */
  
  public static void concatenate(String sourceUrl1, String sourceUrl2,
    String destinationUrl)
  {
    out.printf("transcode %s + %s -> %s\n", sourceUrl1, sourceUrl2,
      destinationUrl);

    // video parameters

    final int videoStreamIndex = 0;
    final int videoStreamId = 0;
    final int width = 640;
    final int height = 360;

    // audio parameters

    final int audioStreamIndex = 1;
    final int audioStreamId = 0;
    final int channelCount = 2;
    final int sampleRate = 11025; // Hz

    // create the first media reader

    IMediaReader reader1 = ToolFactory.makeReader(sourceUrl1);

    // create the second media reader

    IMediaReader reader2 = ToolFactory.makeReader(sourceUrl2);

    // create the media concatenator

    MediaConcatenator concatenator = new MediaConcatenator(audioStreamIndex,
      videoStreamIndex);

    // concatenator listens to both readers

    reader1.addListener(concatenator);
    reader2.addListener(concatenator);

    // create the media writer which listens to the concatenator

    IMediaWriter writer = ToolFactory.makeWriter(destinationUrl);
    concatenator.addListener(writer);

    // add the video stream

    writer.addVideoStream(videoStreamIndex, videoStreamId, width, height);

    // add the audio stream

    writer.addAudioStream(audioStreamIndex, audioStreamId, channelCount,
      sampleRate);

    // read packets from the first source file until done

    while (reader1.readPacket() == null)
      ;

    // read packets from the second source file until done

    while (reader2.readPacket() == null)
      ;

    // close the writer

    writer.close();
  }
  
  static class MediaConcatenator extends MediaToolAdapter
  {
    // the current offset
    
    private long mOffset = 0;
    
    // the next video timestamp
    
    private long mNextVideo = 0;
    
    // the next audio timestamp
    
    private long mNextAudio = 0;

    // the index of the audio stream
    
    private final int mAudoStreamIndex;
    
    // the index of the video stream
    
    private final int mVideoStreamIndex;
    
    /**
     * Create a concatenator.
     * 
     * @param audioStreamIndex index of audio stream
     * @param videoStreamIndex index of video stream
     */
    
    public MediaConcatenator(int audioStreamIndex, int videoStreamIndex)
    {
      mAudoStreamIndex = audioStreamIndex;
      mVideoStreamIndex = videoStreamIndex;
    }
    
    public void onAudioSamples(IAudioSamplesEvent event)
    {
      IAudioSamples samples = event.getAudioSamples();
      
      // set the new time stamp to the original plus the offset established
      // for this media file

      long newTimeStamp = samples.getTimeStamp() + mOffset;

      // keep track of predicted time of the next audio samples, if the end
      // of the media file is encountered, then the offset will be adjusted
      // to this time.

      mNextAudio = samples.getNextPts();

      // set the new timestamp on audio samples

      samples.setTimeStamp(newTimeStamp);

      // create a new audio samples event with the one true audio stream
      // index

      super.onAudioSamples(new AudioSamplesEvent(this, samples,
        mAudoStreamIndex));
    }

    public void onVideoPicture(IVideoPictureEvent event)
    {
      IVideoPicture picture = event.getMediaData();
      long originalTimeStamp = picture.getTimeStamp();

      // set the new time stamp to the original plus the offset established
      // for this media file

      long newTimeStamp = originalTimeStamp + mOffset;

      // keep track of predicted time of the next video picture, if the end
      // of the media file is encountered, then the offset will be adjusted
      // to this time.
      //
      // You'll note in the audio samples listener above we used
      // a method called getNextPts().  Video pictures don't have
      // a similar method because frame-rates can be variable, so
      // we don't know.  The minimum thing we do know though (since
      // all media containers require media to have monotonically
      // increasing time stamps), is that the next video timestamp
      // should be at least one tick ahead.  So, we fake it.
      
      mNextVideo = originalTimeStamp + 1;

      // set the new timestamp on video samples

      picture.setTimeStamp(newTimeStamp);

      // create a new video picture event with the one true video stream
      // index

      super.onVideoPicture(new VideoPictureEvent(this, picture,
        mVideoStreamIndex));
    }
    
    public void onClose(ICloseEvent event)
    {
      // update the offset by the larger of the next expected audio or video
      // frame time

      mOffset = Math.max(mNextVideo, mNextAudio);

      if (mNextAudio < mNextVideo)
      {
        // In this case we know that there is more video in the
        // last file that we read than audio. Technically you
        // should pad the audio in the output file with enough
        // samples to fill that gap, as many media players (e.g.
        // Quicktime, Microsoft Media Player, MPlayer) actually
        // ignore audio time stamps and just play audio sequentially.
        // If you don't pad, in those players it may look like
        // audio and video is getting out of sync.

        // However kiddies, this is demo code, so that code
        // is left as an exercise for the readers. As a hint,
        // see the IAudioSamples.defaultPtsToSamples(...) methods.
      }
    }

    public void onAddStream(IAddStreamEvent event)
    {
      // overridden to ensure that add stream events are not passed down
      // the tool chain to the writer, which could cause problems
    }

    public void onOpen(IOpenEvent event)
    {
      // overridden to ensure that open events are not passed down the tool
      // chain to the writer, which could cause problems
    }

    public void onOpenCoder(IOpenCoderEvent event)
    {
      // overridden to ensure that open coder events are not passed down the
      // tool chain to the writer, which could cause problems
    }

    public void onCloseCoder(ICloseCoderEvent event)
    {
      // overridden to ensure that close coder events are not passed down the
      // tool chain to the writer, which could cause problems
    }
  }
}

Thursday, October 17, 2013

Rocket Spline [ info and download ]

Hi guys! If everything worked just as planned, today I finally got my degree in computer science (sorta).
As my project I presented a game made in Unity 3.5 called Rocket Spline; it runs on Windows PCs and Android devices. It's a puzzle game where the user defines the path of a rocket by inserting "control points" into the scene.
Download Links

This is my first non-free game on the market, I hope you like it!

Friday, October 4, 2013

Kagerou Project Fan Game [>Download Page<]


--------------------------------------------------------
So, what happened? Why did it take me so long to finally publish this thing? Well, not much happened, really. I've tried to contact Jin until now: Twitter, email, the IA contact form, nothing. I have to thank a lot of kind people who offered me help and translated stuff from Italian to Japanese, but in the end a month passed and still no answer. So what now? Am I releasing a game without the permission of the guy who makes the songs? Yes.
I'll shut everything down if I'm ever asked to; it's a fan game and I'm not making any money from it.
Anyway, you will notice from the trailer that someone really AWESOME did some audio tracks for the game; yes, I'm talking about JubyPhonic. I have to thank her for spreading the word about my game so much, and I hope people will be aware of my works in the future.

And now some questions and answers:
-What happened to the Android version?
Unfortunately I have to wait for Jin's permission for that one, since I plan to publish it on Google Play in the future.
-Will there be more songs in the game?
I don't know. My life is kinda busy at the moment. I would really like to put Outer Science and Sunset Yesterday in the game; I think it will depend on how much free time I'll have and how many people are really interested in my game.
-What about an iOS version of the game?
Sorry guys, I really don't have the money to buy a Mac.
-I like you, I like the game, how can I help?
I doubt someone would like me or my game this much. Anyway, I've made some Android applications in the past and I'll publish some other games on the market in the future. If you like, give them a try. (You know guys, it's really fun pressing ads on my apps for some reason.)

This is it. Have fun with my game.

Sunday, September 15, 2013

Updates on Kagerou Project fan game

Hi guys, just a little update regarding the project. The game is complete, I guess; in the past few days I didn't really have much time to work on it, since I started working and I've almost finished my thesis. It was almost complete when I posted the gameplay video. The reason I still haven't released the game is that I don't think it's fair to do this without a reply from Jin and Shido, since I'm using their works. I tried writing to Jin using the contact form on his website, but still no reply. I think the problem is that I tried to contact him in English. I asked for help on two or three websites, only to be turned down. If someone knows a bit of Japanese, please contact me, so that I can get a proper answer.
If I don't get an answer I'll consider publishing the game like any other fan game, hoping it doesn't trouble the owner of the songs. However, I don't think I would be able to publish the Android version of the game this way. I hope in the future I'll get a chance to continue with this game, so that I can add more songs and features; I really like Outer Science and I want to do something spectacular with it.
Anyway, good news! Here are some of the changes in the new version:
  • You unlock Ayano by getting a good grade in hard mode in any level.
  • The score rating is now better, it's not so insane to get an S.
  • Getting an S in hard mode will unlock a new difficulty.
  • You can skip the logo with back/esc.
Now I don't want this project to die so easily, and I might find the time to fix it while I wait for a reply.
If you want to try the game, contact me. I'll give the Windows version to 5 people who have a YouTube channel and can help me spread the word by posting gameplay videos or reviews.
Now, thanks again everyone, see you later.

Saturday, September 7, 2013

Kagerou Project Game - Update

I'm really happy people liked the game. I have to thank JubyPhonic for helping me spread the word; first of all, thank you all guys for encouraging me with this!

Seems like my Dropbox public account has been disabled because too many people downloaded my game, my bad. I'll be sure to publish my next update on a different host, so that everybody can download it without problems. Speaking of updates, here is a video showing what I'm working on right now:
Now I'm waiting for Jin's approval; I tried to contact him on Twitter and via the IA project website. If anyone wants to help me with this I'd be really happy, since I don't know how to contact him in Japanese.

Now, see ya in the next update!

Worms bar shoot example > Unity3d

Just one of the quick exercises I did when I started playing with Unity. Here we see the code that spawns the bullet in a given direction; you can increase the power of the shot by holding space.

var gui_grandezza : int = 256;
var potenza : float = 0;
private var increasing : boolean = false;
var shooting : boolean = false;
var barraspeed : int = 100;
var proiettile : GameObject;
var spawnpoint : Transform;
var shotforce : float = 5;
var cubi : GameObject;
private var currentCrates : GameObject;
var blastPart : ParticleEmitter;
var altezza : GUITexture;

function Start(){
 guiTexture.pixelInset.width = 0;
 var somecrates : GameObject = Instantiate(cubi, Vector3(8,1,-6), transform.rotation);
 currentCrates = somecrates;
}

function Update(){
 if(!shooting && Input.GetButtonDown("Jump")){
  increasing = true;
 }
 var vertical : float = Input.GetAxis("Vertical") * -1;
 spawnpoint.Rotate(Vector3(vertical,0,0) * Time.deltaTime * 10);
 altezza.guiTexture.pixelInset.y += vertical * -1;
 
 if(!shooting && Input.GetButtonUp("Jump")){
  increasing = false;
  shoot(potenza);
  potenza = 0;
 }
 
 if(increasing){
  potenza += Time.deltaTime * barraspeed;
  potenza = Mathf.Clamp(potenza, 0, gui_grandezza);
  guiTexture.pixelInset.width = potenza;
 }
 
}

function shoot(potenza : float){
 shooting = true;
 var pBlast : ParticleEmitter = Instantiate(blastPart, spawnpoint.position, spawnpoint.rotation); //just a particle emitter
 //base blast amount on power argument, and divide it to diminish power
 pBlast.maxEmission = potenza/10;
 var pFab : GameObject = Instantiate(proiettile, spawnpoint.position, spawnpoint.rotation); //that's our bullet
 pFab.rigidbody.AddForce(spawnpoint.forward * potenza * shotforce);

//we reset the environment after 4 seconds
 Destroy(pFab.gameObject, 4);
 Destroy(pBlast.gameObject, 1);
 yield WaitForSeconds(4);
 potenza = 0;
 guiTexture.pixelInset.width = 0;
 Destroy(currentCrates);
 shooting = false;
 var somecrates2 : GameObject = Instantiate(cubi, Vector3(8,1,-6), transform.rotation);
 currentCrates = somecrates2;
}


When we hold the "jump" button (space by default) the variable increasing is set to true. While increasing is true we use potenza += Time.deltaTime * barraspeed; potenza = Mathf.Clamp(potenza, 0, gui_grandezza); to add power to our shot. When the jump button is released, Input.GetButtonUp("Jump") returns true; at this point we call the function shoot(potenza) and set increasing to false. The shoot function instantiates a particle emitter and our bullet. The bullet pFab has a rigidbody, and we use AddForce to make it fly forward from our spawnpoint with a velocity of potenza * shotforce. Most of the variable names are in Italian, I hope it's not too difficult to follow; in the end it's a really simple example. Here you can download the full project.
-------------------
ITA: The code tries to simulate the shot bar from Worms: holding the key down increases the potenza variable, and when the space key is released the shoot function instantiates a rigidbody to which a force is added through AddForce. The same shoot function also takes care of resetting the scene; the whole example can be downloaded from the following link.

Wednesday, September 4, 2013

Get public IP address by code > Java

It can be useful for your application to know its public IP address. Instead of getting the information from your network card, which isn't guaranteed to know your public address (it usually only stores your local IP and the gateway), you can use a service like http://checkip.amazonaws.com .
By making an HTTP request to checkip.amazonaws.com we can easily get our public IP address.

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.URL;

public static String getIp() throws Exception {
        URL whatismyip = new URL("http://checkip.amazonaws.com");
        BufferedReader in = null;
        try {
            in = new BufferedReader(new InputStreamReader(
                    whatismyip.openStream()));
            String ip = in.readLine();
            return ip;
        } finally {
            if (in != null) {
                try {
                    in.close();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }
    }
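
A quick usage sketch (the class name is hypothetical, and it assumes getIp() lives in that same class):
public class WhatIsMyIp {
    public static void main(String[] args) throws Exception {
        // prints whatever address the service reports, e.g. "203.0.113.42"
        System.out.println("Public IP: " + getIp());
    }
}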
--------------------
ITA: The function above returns the machine's public IP address by making an HTTP request to a service offered by Amazon. People usually look at what is bound to the network card, but that approach doesn't always lead to a solution, since only the local network address is stored there. Instead of Amazon you could also use a service implemented on your own server, for better reliability in the future.

MekakuCityDays project - A rhythmic game

In the last three days I got bored and started to develop a new rhythm game with Unity that can run both on PC and on Android.
Since I'm a fan of Kagerou Project, a series of Japanese songs made by Jin, I decided to make a little fan game.
I'm open to suggestions and I really need someone who can test it. I don't know how much more free time I'll have to spend on it; for now, here is a pre-alpha: DOWNLOAD
In the .rar there is an .apk if you want to play it on your device.


Featured songs are: Konoha's State of The World, Headphone Actor, Shounen Brave.
UPDATE: Blindfolded Code, Night Talk Deceive
Gameplay: Use these buttons ->
  • A or left arrow : for the left trigger
  • G or down arrow: for the middle trigger
  • L or right arrow: for the right trigger
  • Esc : for the menu/pause
ITA: The keys to use are A, G and L. The Esc key opens the pause menu.
The game is inspired by the series mentioned above.

UPDATE: Seems like so many people downloaded my game that Dropbox decided to lock my public account. Don't worry though, in the next few days I'll come out with a new update featuring one more "hidden" song, so please stay tuned!
Also, if someone knows how to contact Jin regarding this project, please let me know.
See this.

Sunday, August 25, 2013

Converting video with JAVE

JAVE is a Java library developed by sauronsoftware; it's a wrapper around FFmpeg and an easy way to convert a video from one format to another. The main class of JAVE is it.sauronsoftware.jave.Encoder; Encoder objects have methods that transcode multimedia files.
The first thing to do is add JAVE to your CLASSPATH; if you are using Eclipse, simply import jave-1.0.jar into your project. Now create an Encoder object in your code:
Encoder encoder = new Encoder();
Now you just call encode() to do the work; let's look at its signature:
public void encode(java.io.File source,
                   java.io.File target,
                   it.sauronsoftware.jave.EncodingAttributes attributes)
            throws java.lang.IllegalArgumentException,
                   it.sauronsoftware.jave.InputFormatException,
                   it.sauronsoftware.jave.EncoderException
the first argument (source) is the file you want to transcode, the second one (target) is the file that will be created on the machine. The attributes argument, of type it.sauronsoftware.jave.EncodingAttributes, is a structure holding all the properties we want the output video to have. Note: the method is synchronous, so it returns only once the transcoding has finished. Let's see how to convert a video by setting VideoAttributes and AudioAttributes:
 public static void convertVideo(String videoInput, String videoName) throws IllegalArgumentException, InputFormatException, EncoderException
 {
  String videoOutput = videoName + ".flv";
  Encoder encoder = new Encoder();
  
  File source = new File(videoInput);
  File target = new File(videoOutput);
  AudioAttributes audio = new AudioAttributes();
  audio.setCodec("libmp3lame");
  audio.setBitRate(new Integer(Config.AUDIO_BITRATE * 1000));
  audio.setChannels(new Integer(Config.AUDIO_CHANNELS));
  audio.setSamplingRate(new Integer(Config.AUDIO_SAMPLINGRATE));
  VideoAttributes video = new VideoAttributes();
  video.setCodec("flv");
  video.setBitRate(new Integer(Config.VIDEO_BITRATE*10000));
  video.setFrameRate(new Integer(Config.VIDEO_FRAMERATE));
  video.setSize(new VideoSize(Config.VIDEO_WIDTH, Config.VIDEO_HEIGHT));
  EncodingAttributes attrs = new EncodingAttributes();
  attrs.setFormat("flv");
  
  attrs.setAudioAttributes(audio);
  attrs.setVideoAttributes(video);
  
  encoder.encode(source, target, attrs);
 }
The video is converted to the .flv format with the given frame rate, width, height and bitrate. The audio is transcoded with libmp3lame at the given sampling rate. The values I've used to get a reasonably small .flv file are the following:
public class Config {
 public static int VIDEO_WIDTH = 640;
 public static int VIDEO_HEIGHT = 360;
 public static int VIDEO_FRAMERATE = 15;
 public static int VIDEO_BITRATE = 128;
 
 public static int AUDIO_SAMPLINGRATE = 22050;
 public static int AUDIO_CHANNELS = 1;
 public static int AUDIO_BITRATE = 64;
}
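
As a minimal usage sketch (hypothetical file names, simply calling the convertVideo method shown above):
public static void main(String[] args) throws Exception {
    // produces "clip.flv" in the working directory, using the settings in Config
    convertVideo("input.avi", "clip");
}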
That's all. For media manipulation I suggest looking at FFmpeg and at my previous post to get an idea of how to do it in Java. --------------------------------------------------------------------
ITA
The JAVE documentation is also available in Italian on its website; my example encodes a video from any compatible input format to .flv with the settings given in the Config class.

Tuesday, August 20, 2013

Transcoding and Media Modification > Java - FFmpeg

During a job I had to write some code to make quick edits to a video, such as format conversions and cuts. After looking at several libraries I realized that the best solution is to use ffmpeg.exe, an executable that performs simple operations on multimedia files: cuts, merges, frame capture.

To process a video you call ffmpeg.exe from the command line with the right parameters, for example:
ffmpeg -i foo.avi -r 1 -s WxH -f image2 foo.jpeg
This line produces a .jpeg image of size WxH from the first frame of the video.
The FFmpeg documentation can be found at this address; among the most important options I want to mention:

  • -r fps -> sets the output frame rate
  • -s size -> sets the video size, with the syntax WidthxHeight
  • -vcodec codec -> sets the output codec
Once you understand how the executable works, it's easy to create a wrapper in Java: just use the Runtime class to get the system runtime. Below is a function that, given two time instants and the input and output video paths, uses FFmpeg to produce a trimmed video.


	public static void singleCut(double start, double end,String videoPathIn, String videoPathOut) throws IllegalArgumentException, InputFormatException, EncoderException, IOException, InterruptedException
	{
		String cmd = Config.FFMPEG+" -i "+ videoPathIn +" -q 5 -ss "+ start +" -to "+ end +" -y "+videoPathOut;
		//System.out.print(cmd);
		Runtime runtime = Runtime.getRuntime();
		Process p = runtime.exec(cmd);
		p.waitFor();
	}
In the code above, Config.FFMPEG is just a static variable holding the path to our ffmpeg.exe.
For converting videos from code I would also like to point out JAVE, a library developed by an Italian developer.
----------------------------------------------------------------------
ENG:
To transcode and run single tasks on multimedia files from code, the best way I've found is to build a wrapper around FFmpeg.exe. As you can see from the documentation, this tool lets you do almost anything with video and audio files; things like extracting images or performing single cuts are really easy and only require reading up on how to do them. For example, the following line grabs an image of the first frame of the video, with width and height WxH:
ffmpeg -i foo.avi -r 1 -s WxH -f image2 foo.jpeg
To easily call FFmpeg from Java, simply use the Runtime class to run what you would normally type on the command line.

	public static void singleCut(double start, double end,String videoPathIn, String videoPathOut) throws IllegalArgumentException, InputFormatException, EncoderException, IOException, InterruptedException
	{
		String cmd = Config.FFMPEG+" -i "+ videoPathIn +" -q 5 -ss "+ start +" -to "+ end +" -y "+videoPathOut;
		//System.out.print(cmd);
		Runtime runtime = Runtime.getRuntime();
		Process p = runtime.exec(cmd);
		p.waitFor();
	}
The cmd string is the command line we want to execute through runtime.exec(cmd); Config.FFMPEG is just a static string with the absolute path of FFmpeg.exe.
I would recommend taking a look at JAVE if you just want to do simple video transcoding; it's quite nice and fast to use.
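
Just as an illustration (not part of the original post), here is a minimal sketch of what such a Config holder could look like, together with a ProcessBuilder variant of the same cut. The ffmpeg path below is purely hypothetical, and ProcessBuilder is my own suggestion: Runtime.exec(String) splits the command on whitespace, so paths containing spaces are handled more safely this way.

import java.io.IOException;

public class Config {
	// hypothetical location: point this to your own ffmpeg / ffmpeg.exe
	public static final String FFMPEG = "C:/tools/ffmpeg/bin/ffmpeg.exe";

	// alternative to the Runtime-based version above: each argument is passed separately
	public static void singleCut(double start, double end, String videoPathIn, String videoPathOut)
			throws IOException, InterruptedException {
		ProcessBuilder pb = new ProcessBuilder(FFMPEG, "-i", videoPathIn,
				"-q", "5", "-ss", String.valueOf(start), "-to", String.valueOf(end),
				"-y", videoPathOut);
		pb.inheritIO(); // show ffmpeg's console output in ours
		Process p = pb.start();
		p.waitFor();
	}
}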

Monday, August 19, 2013

Blogger > Posting Code

When I signed up on Blogger I expected there to be no tool for posting walls of code in a readable way. Fortunately, you can integrate pretty much anything into the blog template simply by editing its HTML.
To get the same effect used on this blog go to My blogs > Template > Edit HTML, then copy what follows right after the closing head tag ( </head> ).

 
 
 




Now, every time you write a new post, switch to the HTML view and use:

<pre class="brush: java">
//Put your Java code here
</pre>

to write Java code, and

<pre class="brush: xml">
//Put your xml code here
</pre>

for XML.

------------------------
ENG: 1 - Go to your blog.
2 - Click on Template.
3 - Click on Edit HTML.
4 - Copy and paste the first snippet right before the closing head tag.
5 - Use the tags shown above when writing a post.

Credits to SyntaxHighlighter

Sunday, August 18, 2013

Equalizer > Android > Java

Those of you who have already programmed for Android will have noticed how vast the development environment is: Google provides well-documented classes for every component in the phone. A while ago I happened to use the Equalizer and BassBoost components, for which I found little documentation; below we see how to obtain these objects and how to manage their properties in a separate Activity.

Main Activity
package it.test.equalizertest;

import android.media.AudioManager;
import android.media.MediaPlayer;
import android.media.audiofx.BassBoost;
import android.media.audiofx.Equalizer;
import android.os.Bundle;
import android.app.Activity;
import android.content.Intent;
import android.util.Log;
import android.view.Menu;
import android.view.View;
import android.view.View.OnClickListener;
import android.widget.Button;

public class MainActivity extends Activity {

 MediaPlayer mp;
 public static Equalizer equalizer;
 public static BassBoost bassboost;
 String SONG_PATH = "/storage/sdcard0/media/audio/06-soft-rock-star.mp3";
 //string with the path of the song
 @Override
 protected void onCreate(Bundle savedInstanceState) {
  super.onCreate(savedInstanceState);
  setContentView(R.layout.activity_main);
  
  //MediaPlayer
     mp = new MediaPlayer();
     equalizer = new Equalizer(99999999, mp.getAudioSessionId()); //get the Equalizer with high priority
     bassboost = new BassBoost(999999, mp.getAudioSessionId()); //get the BassBoost with high priority

     setVolumeControlStream(AudioManager.STREAM_MUSIC);

  //enable the Equalizer
  int val = equalizer.setEnabled(true);
     if(val != Equalizer.SUCCESS)
      Log.v("A", "EQUALIZER NON ATTIVO" + val);
     val = bassboost.setEnabled(true);
     if(val != Equalizer.SUCCESS)
      Log.v("A", "BASSBOOST NON ATTIVO" + val);
     else
      Log.v("A", "SUCCESS!");
     
     //Button Play plays a song in SONG_PATH
     Button btn_play = (Button) findViewById(R.id.button_play);
     btn_play.setOnClickListener(new OnClickListener() {
   
   @Override
   public void onClick(View v) {
    try{
      mp.reset();
      mp.setDataSource(SONG_PATH); //set the audio file as the data source
      mp.prepare();
      mp.start(); //start the media player
     }
    catch(Exception ex){};
   }
  });
     
     Button btn_eq = (Button) findViewById(R.id.button_equalizer);
     btn_eq.setOnClickListener(new OnClickListener() {
   
   @Override
   public void onClick(View v) {
    openEqualizer(); //open a new activity
   }
  });
 }

 @Override
 public void onDestroy()
 {
   super.onDestroy();
   
   if(mp != null){
    mp.release(); //release the media player
   }
   if(equalizer != null)
    equalizer.release();
   if(bassboost != null)
    bassboost.release();
 }
 
 @Override
 public boolean onCreateOptionsMenu(Menu menu) {
  // Inflate the menu; this adds items to the action bar if it is present.
  getMenuInflater().inflate(R.menu.main, menu);
  return true;
 }
 
 public void openEqualizer()
 {
  Intent i = new Intent(MainActivity.this, EqualizerActivity.class);
  startActivity(i); //open the equalizer activity
  //the equalizer object is shared through the static field
 }

}
What we have done so far is obtain the Equalizer and BassBoost objects from the device. These objects are unique, which is why we have to specify the priority we request them with: applications like n7Player keep processes that hold control over the equalizer, bass boost and so on at maximum priority, so in an application it is essential to check whether the program actually has the privileges to change the equalizer's properties. The Equalizer.SUCCESS constant lets us verify exactly that, as shown in the code above.
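
A defensive acquisition helper, just a sketch of mine and not part of the original example, could look like this:
private Equalizer tryGetEqualizer(MediaPlayer player) {
	try {
		Equalizer eq = new Equalizer(0, player.getAudioSessionId());
		if (eq.setEnabled(true) != Equalizer.SUCCESS) {
			// another app probably holds the effect with a higher priority
			eq.release();
			return null;
		}
		return eq;
	} catch (RuntimeException ex) {
		// the effect may simply not be available on this device
		return null;
	}
}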
In a new activity we create, in code, the sliders needed to control the Equalizer:
EqualizerActivity
package it.test.equalizertest;

import android.app.Activity;
import android.media.audiofx.BassBoost;
import android.media.audiofx.Equalizer;
import android.os.Bundle;
import android.util.Log;
import android.view.Gravity;
import android.view.ViewGroup;
import android.widget.LinearLayout;
import android.widget.SeekBar;
import android.widget.TextView;

public class EqualizerActivity extends Activity {

    private Equalizer equalizer;
    private BassBoost bassboost;
    private short bbs = 0;
    TextView bbTextView;
    private static final short BASSBOOST_MAX_STRENGTH = 1000; //MIN = 0

    @Override
    public void onCreate(Bundle icicle) {
        super.onCreate(icicle);
        equalizer = MainActivity.equalizer; //taken from the main activity -> always active
        bassboost = MainActivity.bassboost;
        int val = equalizer.setEnabled(true);
        if(val != Equalizer.SUCCESS)
         Log.v("A", "EQUALIZER NON ATTIVO " + val);
        setupEqualizerFXandUI();
        
    }
    
    
    //the UI is generated dynamically according to the bands reported by the device
    private void setupEqualizerFXandUI()
    {
     TextView eqTextView = new TextView(this);
     eqTextView.setText("Equalizer:");
     LinearLayout ll = new LinearLayout(this);
     ll.setOrientation(LinearLayout.VERTICAL);
     ll.addView(eqTextView);
     
     setContentView(ll);
     
      short bands = equalizer.getNumberOfBands(); //number of adjustable frequency bands
      final short minEQLevel = equalizer.getBandLevelRange()[0]; //minimum level per band
      final short maxEQLevel = equalizer.getBandLevelRange()[1]; //maximum level per band
      
      LinearLayout.LayoutParams layoutParams = new LinearLayout.LayoutParams(ViewGroup.LayoutParams.FILL_PARENT, ViewGroup.LayoutParams.WRAP_CONTENT);
   layoutParams.weight = 1;
      
      for(short i = 0; i< bands; i++)
      {
       final short band = i;
       //Log.v("A", "B "+ band);
       TextView freqTv = new TextView(this);
       freqTv.setLayoutParams(new ViewGroup.LayoutParams(ViewGroup.LayoutParams.FILL_PARENT, ViewGroup.LayoutParams.WRAP_CONTENT));
       freqTv.setGravity(Gravity.CENTER_HORIZONTAL);
       freqTv.setText((equalizer.getCenterFreq(band) /1000) + " Hz");
       ll.addView(freqTv);
       
       LinearLayout row = new LinearLayout(this);
       row.setOrientation(LinearLayout.HORIZONTAL);
       
       TextView minDbTv = new TextView(this);
       minDbTv.setLayoutParams(new ViewGroup.LayoutParams(ViewGroup.LayoutParams.WRAP_CONTENT, ViewGroup.LayoutParams.WRAP_CONTENT));
       minDbTv.setText((minEQLevel / 100) + " dB");
       
       TextView maxDbTv = new TextView(this);
       maxDbTv.setLayoutParams(new ViewGroup.LayoutParams(ViewGroup.LayoutParams.WRAP_CONTENT, ViewGroup.LayoutParams.WRAP_CONTENT));
       maxDbTv.setText((maxEQLevel / 100) + " dB");
       
       SeekBar bar = new SeekBar(this);
       bar.setLayoutParams(layoutParams);
       bar.setMax(maxEQLevel - minEQLevel);
       bar.setProgress(equalizer.getBandLevel(band)); //current level of this band
       
       bar.setOnSeekBarChangeListener(new SeekBar.OnSeekBarChangeListener() {
    
    @Override
    public void onStopTrackingTouch(SeekBar seekBar) {
    }
    
    @Override
    public void onStartTrackingTouch(SeekBar seekBar) {
     
    }
    
    @Override
    public void onProgressChanged(SeekBar seekBar, int progress,
      boolean fromUser) {
     equalizer.setBandLevel(band, (short)(progress + minEQLevel));
     Log.v("A", "LEVEL: " + (progress + minEQLevel));
    }
   });
       
       row.addView(minDbTv);
       row.addView(bar);
       row.addView(maxDbTv);
       
       ll.addView(row);
      }
      
     //BASS BOOST
     bbs = bassboost.getRoundedStrength();
     bbTextView = new TextView(this);
      bbTextView.setText("BassBoost: " + bbs);
      
      SeekBar bar = new SeekBar(this);
   bar.setLayoutParams(layoutParams);
   bar.setMax(BASSBOOST_MAX_STRENGTH);
   bar.setProgress(bassboost.getRoundedStrength());
   
   bar.setOnSeekBarChangeListener(new SeekBar.OnSeekBarChangeListener() {
   
   @Override
   public void onStopTrackingTouch(SeekBar seekBar) {
   }
   
   @Override
   public void onStartTrackingTouch(SeekBar seekBar) {
    
   }
   
   @Override
   public void onProgressChanged(SeekBar seekBar, int progress,
     boolean fromUser) {
    bbs = (short) progress;
    bassboost.setStrength(bbs);
    bbTextView.setText("BassBoost: " + bbs);
   }
  });
      
      LinearLayout row = new LinearLayout(this);
  row.setOrientation(LinearLayout.HORIZONTAL);
  
  row.addView(bbTextView);
  row.addView(bar);
  ll.addView(row);
    }
}


The code above is fairly self-explanatory; if you are not too familiar with Android I suggest looking at how layouts and views are created dynamically in code. With equalizer.setEnabled we check again that the object is working for our app, equalizer.getNumberOfBands() returns the number of bands we can adjust, and with equalizer.setBandLevel we can set the level of a specific band id.
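
Besides setting each band by hand, the Equalizer class also exposes built-in presets; a small sketch (not part of the example above) of how they could be listed and applied:
// assuming "equalizer" is the same android.media.audiofx.Equalizer instance used above
short presets = equalizer.getNumberOfPresets();
for (short p = 0; p < presets; p++) {
	Log.v("A", "Preset " + p + ": " + equalizer.getPresetName(p));
}
if (presets > 0) {
	equalizer.usePreset((short) 0); // apply the first preset reported by the device
}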


The full documentation for this class is available on this page.
The complete example can be downloaded from the following link: https://dl.dropboxusercontent.com/u/23802589/EqualizerTest.rar