
How to programmatically encode images into a video file in Java?

I am trying to encode several images of the same resolution into a video file. For that I have tried:

jCodec

  • jcodec .. example description

    However, it takes a very long time, it is not a proper tool for encoding a large number of images, and it only produces QuickTime (.mov) output.

FFmpeg

  • FFmpeg .. example description

    However, ffmpeg can only create a video from image files, so the images first have to exist physically on the file system.

Xuggler can create video files from a Java program through its API, but its site appears to be broken, so I could not try it.

Does anyone know how to encode images into a video file in Java?

Thanks in advance!

11
ANUJ SINGH

Xuggler is deprecated; use Humble-Video instead. It already comes with several demo projects, including one that takes screenshots and turns them into a video file: RecordAndEncodeVideo.java

/*******************************************************************************
 * Copyright (c) 2014, Art Clarke.  All rights reserved.
 * <p>
 * This file is part of Humble-Video.
 * <p>
 * Humble-Video is free software: you can redistribute it and/or modify
 * it under the terms of the GNU Affero General Public License as published by
 * the Free Software Foundation, either version 3 of the License, or
 * (at your option) any later version.
 * <p>
 * Humble-Video is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 * GNU Affero General Public License for more details.
 * <p>
 * You should have received a copy of the GNU Affero General Public License
 * along with Humble-Video.  If not, see <http://www.gnu.org/licenses/>.
 *******************************************************************************/
package io.humble.video.demos;

import io.humble.video.*;
import io.humble.video.awt.MediaPictureConverter;
import io.humble.video.awt.MediaPictureConverterFactory;
import org.apache.commons.cli.*;

import java.awt.*;
import java.awt.image.BufferedImage;
import java.io.IOException;

/**
 * Records the contents of your computer screen to a media file for the passed in duration.
 * This is meant as a demonstration program to teach the use of the Humble API.
 * <p>
 * Concepts introduced:
 * </p>
 * <ul>
 * <li>Muxer: A {@link Muxer} object is a container you can write media data to.</li>
 * <li>Encoders: An {@link Encoder} object lets you convert {@link MediaAudio} or {@link MediaPicture} objects into {@link MediaPacket} objects
 * so they can be written to {@link Muxer} objects.</li>
 * </ul>
 *
 * <p>
 * To run from maven, do:
 * </p>
 * <pre>
 * mvn install exec:java -Dexec.mainClass="io.humble.video.demos.RecordAndEncodeVideo" -Dexec.args="filename.mp4"
 * </pre>
 *
 * @author aclarke
 *
 */
public class RecordAndEncodeVideo
{
    /**
     * Records the screen
     */
    private static void recordScreen (String filename, String formatname, String codecname, int duration, int snapsPerSecond) throws AWTException, InterruptedException, IOException
    {
        /**
         * Set up the AWT infrastructure to take screenshots of the desktop.
         */
        final Robot robot = new Robot();
        final Toolkit toolkit = Toolkit.getDefaultToolkit();
        final Rectangle screenbounds = new Rectangle(toolkit.getScreenSize());

        final Rational framerate = Rational.make(1, snapsPerSecond);

        /** First we create a muxer using the passed in filename and formatname if given. */
        final Muxer muxer = Muxer.make(filename, null, formatname);

        /** Now, we need to decide what type of codec to use to encode video. Muxers
         * have limited sets of codecs they can use. We're going to pick the first one that
         * works, or if the user supplied a codec name, we're going to force-fit that
         * in instead.
         */
        final MuxerFormat format = muxer.getFormat();
        final Codec codec;
        if (codecname != null)
        {
            codec = Codec.findEncodingCodecByName(codecname);
        }
        else
        {
            codec = Codec.findEncodingCodec(format.getDefaultVideoCodecId());
        }

        /**
         * Now that we know what codec, we need to create an encoder
         */
        Encoder encoder = Encoder.make(codec);

        /**
         * Video encoders need to know at a minimum:
         *   width
         *   height
         *   pixel format
         * Some also need to know frame-rate (older codecs that had a fixed rate at which video files could
         * be written needed this). There are many other options you can set on an encoder, but we're
         * going to keep it simpler here.
         */
        encoder.setWidth(screenbounds.width);
        encoder.setHeight(screenbounds.height);
        // We are going to use 420P as the format because that's what most video formats these days use
        final PixelFormat.Type pixelformat = PixelFormat.Type.PIX_FMT_YUV420P;
        encoder.setPixelFormat(pixelformat);
        encoder.setTimeBase(framerate);

        /** An annoyance of some formats is that they need global (rather than per-stream) headers,
         * and in that case you have to tell the encoder. And since Encoders are decoupled from
         * Muxers, there is no easy way to know this other than checking the muxer format's flag,
         * as we do below.
         */
        if (format.getFlag(MuxerFormat.Flag.GLOBAL_HEADER))
        {
            encoder.setFlag(Encoder.Flag.FLAG_GLOBAL_HEADER, true);
        }

        /** Open the encoder. */
        encoder.open(null, null);


        /** Add this stream to the muxer. */
        muxer.addNewStream(encoder);

        /** And open the muxer for business. */
        muxer.open(null, null);

        /** Next, we need to make sure we have the right MediaPicture format objects
         * to encode data with. Java (and most on-screen graphics programs) use some
         * variant of Red-Green-Blue image encoding (a.k.a. RGB or BGR). Most video
         * codecs use some variant of YCrCb formatting. So we're going to have to
         * convert. To do that, we'll introduce a MediaPictureConverter object later.
         */
        MediaPictureConverter converter = null;
        final MediaPicture picture = MediaPicture.make(encoder.getWidth(), encoder.getHeight(), pixelformat);
        picture.setTimeBase(framerate);

        /** Now begin our main loop of taking screen snaps.
         * We're going to encode and then write out any resulting packets. */
        final MediaPacket packet = MediaPacket.make();
        for (int i = 0; i < duration / framerate.getDouble(); i++)
        {
            /** Make the screen capture && convert image to TYPE_3BYTE_BGR */
            final BufferedImage screen = convertToType(robot.createScreenCapture(screenbounds), BufferedImage.TYPE_3BYTE_BGR);

            /** This is LIKELY not in YUV420P format, so we're going to convert it using some handy utilities. */
            if (converter == null)
            {
                converter = MediaPictureConverterFactory.createConverter(screen, picture);
            }
            converter.toPicture(picture, screen, i);

            do
            {
                encoder.encode(packet, picture);
                if (packet.isComplete())
                {
                    muxer.write(packet, false);
                }
            } while (packet.isComplete());

            /** now we'll sleep until it's time to take the next snapshot. */
            Thread.sleep((long) (1000 * framerate.getDouble()));
        }

        /** Encoders, like decoders, sometimes cache pictures so it can do the right key-frame optimizations.
         * So, they need to be flushed as well. As with the decoders, the convention is to pass in a null
         * input until the output is not complete.
         */
        do
        {
            encoder.encode(packet, null);
            if (packet.isComplete())
            {
                muxer.write(packet, false);
            }
        } while (packet.isComplete());

        /** Finally, let's clean up after ourselves. */
        muxer.close();
    }

    @SuppressWarnings("static-access")
    public static void main (String[] args) throws InterruptedException, IOException, AWTException
    {
        final Options options = new Options();
        options.addOption("h", "help", false, "displays help");
        options.addOption("v", "version", false, "version of this library");
        options.addOption(OptionBuilder.withArgName("format").withLongOpt("format").hasArg().
                withDescription("muxer format to use. If unspecified, we will guess from filename").create("f"));
        options.addOption(OptionBuilder.withArgName("codec")
                .withLongOpt("codec")
                .hasArg()
                .withDescription("codec to use when encoding video; If unspecified, we will guess from format")
                .create("c"));
        options.addOption(OptionBuilder.withArgName("duration")
                .withLongOpt("duration")
                .hasArg()
                .withDescription("number of seconds of screenshot to record; defaults to 10.")
                .create("d"));
        options.addOption(OptionBuilder.withArgName("snaps per second")
                .withLongOpt("snaps")
                .hasArg()
                .withDescription("number of pictures to take per second (i.e. the frame rate); defaults to 5")
                .create("s"));

        final CommandLineParser parser = new org.apache.commons.cli.BasicParser();
        try
        {
            final CommandLine cmd = parser.parse(options, args);
            final String[] parsedArgs = cmd.getArgs();
            if (cmd.hasOption("version"))
            {
                // let's find what version of the library we're running
                final String version = io.humble.video_native.Version.getVersionInfo();
                System.out.println("Humble Version: " + version);
            }
            else if (cmd.hasOption("help") || parsedArgs.length != 1)
            {
                final HelpFormatter formatter = new HelpFormatter();
                formatter.printHelp(RecordAndEncodeVideo.class.getCanonicalName() + " <filename>", options);
            }
            else
            {
                /**
                 * Read in some option values and their defaults.
                 */
                final int duration = Integer.parseInt(cmd.getOptionValue("duration", "10"));
                if (duration <= 0)
                {
                    throw new IllegalArgumentException("duration must be > 0");
                }
                final int snaps = Integer.parseInt(cmd.getOptionValue("snaps", "5"));
                if (snaps <= 0)
                {
                    throw new IllegalArgumentException("snaps must be > 0");
                }
                final String codecname = cmd.getOptionValue("codec");
                final String formatname = cmd.getOptionValue("format");
                final String filename = cmd.getArgs()[0];

                recordScreen(filename, formatname, codecname, duration, snaps);
            }
        } catch (ParseException e)
        {
            System.err.println("Exception parsing command line: " + e.getLocalizedMessage());
        }
    }

    /**
     * Convert a {@link BufferedImage} of any type, to {@link BufferedImage} of a
     * specified type. If the source image is the same type as the target type,
     * then original image is returned, otherwise new image of the correct type is
     * created and the content of the source image is copied into the new image.
     *
     * @param sourceImage
     *          the image to be converted
     * @param targetType
     *          the desired BufferedImage type
     *
     * @return a BufferedImage of the specified target type.
     *
     * @see BufferedImage
     */
    public static BufferedImage convertToType (BufferedImage sourceImage, int targetType)
    {
        BufferedImage image;

        // if the source image is already the target type, return the source image

        if (sourceImage.getType() == targetType)
        {
            image = sourceImage;
        }

        // otherwise create a new image of the target type and draw the new
        // image

        else
        {
            image = new BufferedImage(sourceImage.getWidth(), sourceImage.getHeight(), targetType);
            image.getGraphics().drawImage(sourceImage, 0, 0, null);
        }

        return image;
    }
}
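
The demo above records the screen, but the original question is about encoding an existing list of images. The following is only a rough sketch of how the same Humble-Video API could be adapted to that case: it reuses convertToType from the class above, assumes every image already matches the requested width and height, and additionally needs java.io.File, java.util.List and javax.imageio.ImageIO imports.

    /** Sketch: encode a list of same-sized image files using the Humble-Video API shown above. */
    public static void encodeImages(List<File> images, String filename, int width, int height, int fps) throws IOException
    {
        final Rational framerate = Rational.make(1, fps);
        final Muxer muxer = Muxer.make(filename, null, null);
        final Codec codec = Codec.findEncodingCodec(muxer.getFormat().getDefaultVideoCodecId());
        final Encoder encoder = Encoder.make(codec);
        encoder.setWidth(width);
        encoder.setHeight(height);
        encoder.setPixelFormat(PixelFormat.Type.PIX_FMT_YUV420P);
        encoder.setTimeBase(framerate);
        if (muxer.getFormat().getFlag(MuxerFormat.Flag.GLOBAL_HEADER))
        {
            encoder.setFlag(Encoder.Flag.FLAG_GLOBAL_HEADER, true);
        }
        encoder.open(null, null);
        muxer.addNewStream(encoder);
        muxer.open(null, null);

        final MediaPicture picture = MediaPicture.make(width, height, PixelFormat.Type.PIX_FMT_YUV420P);
        picture.setTimeBase(framerate);
        final MediaPacket packet = MediaPacket.make();
        MediaPictureConverter converter = null;

        for (int i = 0; i < images.size(); i++)
        {
            // Convert each image to TYPE_3BYTE_BGR so the converter can map it to YUV420P.
            final BufferedImage frame = convertToType(ImageIO.read(images.get(i)), BufferedImage.TYPE_3BYTE_BGR);
            if (converter == null)
            {
                converter = MediaPictureConverterFactory.createConverter(frame, picture);
            }
            converter.toPicture(picture, frame, i);
            do
            {
                encoder.encode(packet, picture);
                if (packet.isComplete())
                {
                    muxer.write(packet, false);
                }
            } while (packet.isComplete());
        }

        // Flush any frames the encoder is still holding back.
        do
        {
            encoder.encode(packet, null);
            if (packet.isComplete())
            {
                muxer.write(packet, false);
            }
        } while (packet.isComplete());

        muxer.close();
    }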

Also check out the other demos: humble-video-demos

I am using it in real time in a web app.

If you want to stream this in real time, you will need an RTSP server. You can either use a large framework such as Red5 Server or Wowza Streaming Engine, or build your own server with Netty, which ships with an RTSP codec since version 3.2.
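
As a very rough illustration (not part of the original answer), a skeleton RTSP endpoint built with Netty could look like the sketch below. It assumes Netty 4.1, where the RTSP codec classes are io.netty.handler.codec.rtsp.RtspDecoder and RtspEncoder (class names differ in the 3.x line mentioned above), and it only logs incoming requests; a real server would answer OPTIONS/DESCRIBE/SETUP/PLAY and push the encoded frames over RTP.

import io.netty.bootstrap.ServerBootstrap;
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.ChannelInitializer;
import io.netty.channel.EventLoopGroup;
import io.netty.channel.SimpleChannelInboundHandler;
import io.netty.channel.nio.NioEventLoopGroup;
import io.netty.channel.socket.SocketChannel;
import io.netty.channel.socket.nio.NioServerSocketChannel;
import io.netty.handler.codec.http.FullHttpRequest;
import io.netty.handler.codec.http.HttpObjectAggregator;
import io.netty.handler.codec.rtsp.RtspDecoder;
import io.netty.handler.codec.rtsp.RtspEncoder;

public class MinimalRtspServer {

    public static void main(String[] args) throws InterruptedException {
        EventLoopGroup boss = new NioEventLoopGroup(1);
        EventLoopGroup workers = new NioEventLoopGroup();
        try {
            ServerBootstrap bootstrap = new ServerBootstrap();
            bootstrap.group(boss, workers)
                     .channel(NioServerSocketChannel.class)
                     .childHandler(new ChannelInitializer<SocketChannel>() {
                         @Override
                         protected void initChannel(SocketChannel ch) {
                             // RTSP reuses Netty's HTTP object model, so the usual HTTP aggregator applies.
                             ch.pipeline().addLast(new RtspDecoder(), new RtspEncoder());
                             ch.pipeline().addLast(new HttpObjectAggregator(64 * 1024));
                             ch.pipeline().addLast(new SimpleChannelInboundHandler<FullHttpRequest>() {
                                 @Override
                                 protected void channelRead0(ChannelHandlerContext ctx, FullHttpRequest request) {
                                     // Stub: a real server would dispatch on OPTIONS/DESCRIBE/SETUP/PLAY here.
                                     System.out.println("RTSP " + request.method() + " " + request.uri());
                                 }
                             });
                         }
                     });
            // 554 is the standard RTSP port; binding to it may require elevated privileges.
            bootstrap.bind(554).sync().channel().closeFuture().sync();
        } finally {
            boss.shutdownGracefully();
            workers.shutdownGracefully();
        }
    }
}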

12
Onur

There are different ways to convert images to a video from the command line, and those commands can be invoked from Java. You can find the commands at the following links:

  1. Converting a sequence of images to a video using ffmpeg
  2. Making a video slideshow from images

I am sharing code snippets to solve the problem:

Code to save a PNG image from an HTML5 canvas

// Uses org.apache.commons.codec.binary.Base64, javax.imageio.ImageIO and java.io/java.awt.image classes;
// "request" is the servlet request carrying the base64-encoded canvas data and the frame number.
byte[] pic = Base64.decodeBase64(request.getParameter("pic"));
String frameCount = request.getParameter("frame");
InputStream in = new ByteArrayInputStream(pic);
BufferedImage bImageFromConvert = ImageIO.read(in);
String outdir = "output\\" + frameCount;
File file = new File(outdir);
if (file.isFile()) {
    file.delete();                                   // remove any stale frame with the same name
}
ImageIO.write(bImageFromConvert, "png", file);       // always write the decoded frame

Code to create images from a video

String filePath = "D:\\temp\\some.mpg";
String outdir = "output";
File file = new File(outdir);
file.mkdirs();
Map<String, String> m = System.getenv();

/*
 * String command[] =
 * {"D:\\ffmpeg-win32-static\\bin\\ffmpeg","-i",filePath
 * ,"-r 30","-f","image2",outdir,"\\user%03d.jpg"};
 * 
 * ProcessBuilder pb = new ProcessBuilder(command); pb.start();
 */
String commands = "D:\\ffmpeg-win32-static\\bin\\ffmpeg -i " + filePath
        + " -r 30  -f image2 " + outdir + "\\image%5d.png";
Process p = Runtime.getRuntime().exec(commands);

Code to create a video from the images

String filePath = "output";
File fileP = new File(filePath);
String commands = "D:\\ffmpeg-win32-static\\bin\\ffmpeg -f image2 -i "
        + fileP + "\\image%5d.png " + fileP + "\\video.mp4";
System.out.println(commands);
Runtime.getRuntime().exec(commands);
System.out.println(fileP.getAbsolutePath());
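
One caveat about these snippets: Runtime.getRuntime().exec returns immediately, and ffmpeg can stall or be cut short if its console output is never read or the Java process exits first. A slightly more defensive variant (a sketch reusing the same placeholder ffmpeg path and file names as above; it needs java.io.BufferedReader and java.io.InputStreamReader imports and lets waitFor's InterruptedException propagate) uses ProcessBuilder, merges stderr into stdout, drains the output, and waits for ffmpeg to finish:

ProcessBuilder pb = new ProcessBuilder(
        "D:\\ffmpeg-win32-static\\bin\\ffmpeg",
        "-f", "image2",
        "-i", "output\\image%5d.png",
        "output\\video.mp4");
pb.redirectErrorStream(true);            // merge stderr into stdout so a single reader is enough
Process p = pb.start();
try (BufferedReader reader = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
    String line;
    while ((line = reader.readLine()) != null) {
        System.out.println(line);        // drain ffmpeg's output so it cannot block on a full pipe
    }
}
int exitCode = p.waitFor();
System.out.println("ffmpeg exited with code " + exitCode);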

Credit goes to @yashprit.


Another approach, for Android developers:

  1. Create a temporary folder inside your Android app.
  2. Copy the images into that folder.
  3. Rename the pictures so that they follow a numeric sequence, for example img1.jpg, img2.jpg, img3.jpg, and so on.
  4. Run ffmpeg -f image2 -i img%d.jpg /tmp/a.mpg programmatically, using the following code:

// Requires java.io.DataOutputStream and java.io.IOException imports, a rooted device ("su"),
// and an ffmpeg binary available on the device.
void convertImg_to_vid()
{
    try {
        Process chperm = Runtime.getRuntime().exec("su");
        DataOutputStream os = new DataOutputStream(chperm.getOutputStream());

        // Feed the ffmpeg command to the root shell: encode img1.jpg, img2.jpg, ... into /tmp/a.mpg
        os.writeBytes("ffmpeg -f image2 -i img%d.jpg /tmp/a.mpg\n");
        os.flush();

        chperm.waitFor();
    } catch (IOException e) {
        e.printStackTrace();
    } catch (InterruptedException e) {
        e.printStackTrace();
    }
}

Resource links:

  1. Create a video file from images using ffmpeg

10
SkyWalker

The Java Media Framework has a utility that can create a video from a list of JPEG images: link

The source code is as follows:

JpegImagesToMovie.java

/*
 * @(#)JpegImagesToMovie.java   1.3 01/03/13
 * Copyright (c) 1999-2001 Sun Microsystems, Inc. All Rights Reserved.
 * Sun grants you ("Licensee") a non-exclusive, royalty free, license to use,
 * modify and redistribute this software in source and binary code form,
 * provided that i) this copyright notice and license appear on all copies of
 * the software; and ii) Licensee does not utilize the software in a manner
 * which is disparaging to Sun.
 * This software is provided "AS IS," without a warranty of any kind. ALL
 * EXPRESS OR IMPLIED CONDITIONS, REPRESENTATIONS AND WARRANTIES, INCLUDING ANY
 * IMPLIED WARRANTY OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE OR
 * NON-INFRINGEMENT, ARE HEREBY EXCLUDED. Sun AND ITS LICENSORS SHALL NOT BE
 * LIABLE FOR ANY DAMAGES SUFFERED BY LICENSEE AS A RESULT OF USING, MODIFYING
 * OR DISTRIBUTING THE SOFTWARE OR ITS DERIVATIVES. IN NO EVENT WILL Sun OR ITS
 * LICENSORS BE LIABLE FOR ANY LOST REVENUE, PROFIT OR DATA, OR FOR DIRECT,
 * INDIRECT, SPECIAL, CONSEQUENTIAL, INCIDENTAL OR PUNITIVE DAMAGES, HOWEVER
 * CAUSED AND REGARDLESS OF THE THEORY OF LIABILITY, ARISING OUT OF THE USE OF
 * OR INABILITY TO USE SOFTWARE, EVEN IF Sun HAS BEEN ADVISED OF THE
 * POSSIBILITY OF SUCH DAMAGES.
 *
 * This software is not designed or intended for use in on-line control of
 * aircraft, air traffic, aircraft navigation or aircraft communications; or in
 * the design, construction, operation or maintenance of any nuclear
 * facility. Licensee represents and warrants that it will not use or
 * redistribute the Software for such purposes.
 */

 package imagetovideo;

 import java.awt.Dimension;
 import java.io.File;
 import java.io.IOException;
 import java.io.RandomAccessFile;
 import java.net.MalformedURLException;
 import java.util.Vector;
 import javax.media.Buffer;
 import javax.media.ConfigureCompleteEvent;
 import javax.media.ControllerEvent;
 import javax.media.ControllerListener;
 import javax.media.DataSink;
 import javax.media.EndOfMediaEvent; 
 import javax.media.Format;
 import javax.media.Manager; 
 import javax.media.MediaLocator;
 import javax.media.PrefetchCompleteEvent;
 import javax.media.Processor;
 import javax.media.RealizeCompleteEvent;
 import javax.media.ResourceUnavailableEvent;
 import javax.media.Time;
 import javax.media.control.TrackControl;
 import javax.media.datasink.DataSinkErrorEvent;
 import javax.media.datasink.DataSinkEvent;
 import javax.media.datasink.DataSinkListener;
 import javax.media.datasink.EndOfStreamEvent;
 import javax.media.format.VideoFormat;
 import javax.media.protocol.ContentDescriptor;
 import javax.media.protocol.DataSource;
 import javax.media.protocol.FileTypeDescriptor;
 import javax.media.protocol.PullBufferDataSource;
 import javax.media.protocol.PullBufferStream;

 /**
  * This program takes a list of JPEG image files and convert them into a
  * QuickTime movie.
 */
 public class JpegImagesToMovie implements ControllerListener, DataSinkListener {

public boolean doIt(int width, int height, int frameRate, Vector inFiles,
        MediaLocator outML) throws MalformedURLException {
    ImageDataSource ids = new ImageDataSource(width, height, frameRate,
            inFiles);

    Processor p;

    try {
        //System.err
        //      .println("- create processor for the image datasource ...");
        p = Manager.createProcessor(ids);
    } catch (Exception e) {
        System.err
                .println("Yikes!  Cannot create a processor from the data source.");
        return false;
    }

    p.addControllerListener(this);

    // Put the Processor into configured state so we can set
    // some processing options on the processor.
    p.configure();
    if (!waitForState(p, p.Configured)) {
        System.err.println("Failed to configure the processor.");
        return false;
    }

    // Set the output content descriptor to QuickTime.
    p.setContentDescriptor(new ContentDescriptor(
            FileTypeDescriptor.QUICKTIME));

    // Query for the processor for supported formats.
    // Then set it on the processor.
    TrackControl tcs[] = p.getTrackControls();
    Format f[] = tcs[0].getSupportedFormats();
    if (f == null || f.length <= 0) {
        System.err.println("The mux does not support the input format: "
                + tcs[0].getFormat());
        return false;
    }

    tcs[0].setFormat(f[0]);

    //System.err.println("Setting the track format to: " + f[0]);

    // We are done with programming the processor. Let's just
    // realize it.
    p.realize();
    if (!waitForState(p, p.Realized)) {
        System.err.println("Failed to realize the processor.");
        return false;
    }

    // Now, we'll need to create a DataSink.
    DataSink dsink;
    if ((dsink = createDataSink(p, outML)) == null) {
        System.err
                .println("Failed to create a DataSink for the given output MediaLocator: "
                        + outML);
        return false;
    }

    dsink.addDataSinkListener(this);
    fileDone = false;

    System.out.println("Generating the video : "+outML.getURL().toString());

    // OK, we can now start the actual transcoding.
    try {
        p.start();
        dsink.start();
    } catch (IOException e) {
        System.err.println("IO error during processing");
        return false;
    }

    // Wait for EndOfStream event.
    waitForFileDone();

    // Cleanup.
    try {
        dsink.close();
    } catch (Exception e) {
    }
    p.removeControllerListener(this);

    System.out.println("Video creation completed!!!!!");
    return true;
}

/**
 * Create the DataSink.
 */
DataSink createDataSink(Processor p, MediaLocator outML) {

    DataSource ds;

    if ((ds = p.getDataOutput()) == null) {
        System.err
                .println("Something is really wrong: the processor does not have an output DataSource");
        return null;
    }

    DataSink dsink;

    try {
        //System.err.println("- create DataSink for: " + outML);
        dsink = Manager.createDataSink(ds, outML);
        dsink.open();
    } catch (Exception e) {
        System.err.println("Cannot create the DataSink: " + e);
        return null;
    }

    return dsink;
}

Object waitSync = new Object();
boolean stateTransitionOK = true;

/**
 * Block until the processor has transitioned to the given state. Return
 * false if the transition failed.
 */
boolean waitForState(Processor p, int state) {
    synchronized (waitSync) {
        try {
            while (p.getState() < state && stateTransitionOK)
                waitSync.wait();
        } catch (Exception e) {
        }
    }
    return stateTransitionOK;
}

/**
 * Controller Listener.
 */
public void controllerUpdate(ControllerEvent evt) {

    if (evt instanceof ConfigureCompleteEvent
            || evt instanceof RealizeCompleteEvent
            || evt instanceof PrefetchCompleteEvent) {
        synchronized (waitSync) {
            stateTransitionOK = true;
            waitSync.notifyAll();
        }
    } else if (evt instanceof ResourceUnavailableEvent) {
        synchronized (waitSync) {
            stateTransitionOK = false;
            waitSync.notifyAll();
        }
    } else if (evt instanceof EndOfMediaEvent) {
        evt.getSourceController().stop();
        evt.getSourceController().close();
    }
}

Object waitFileSync = new Object();
boolean fileDone = false;
boolean fileSuccess = true;

/**
 * Block until file writing is done.
 */
boolean waitForFileDone() {
    synchronized (waitFileSync) {
        try {
            while (!fileDone)
                waitFileSync.wait();
        } catch (Exception e) {
        }
    }
    return fileSuccess;
}

/**
 * Event handler for the file writer.
 */
public void dataSinkUpdate(DataSinkEvent evt) {

    if (evt instanceof EndOfStreamEvent) {
        synchronized (waitFileSync) {
            fileDone = true;
            waitFileSync.notifyAll();
        }
    } else if (evt instanceof DataSinkErrorEvent) {
        synchronized (waitFileSync) {
            fileDone = true;
            fileSuccess = false;
            waitFileSync.notifyAll();
        }
    }
}

/*public static void main(String args[]) {

    if (args.length == 0)
        prUsage();

    // Parse the arguments.
    int i = 0;
    int width = -1, height = -1, frameRate = 1;
    Vector inputFiles = new Vector();
    String outputURL = null;

    while (i < args.length) {

        if (args[i].equals("-w")) {
            i++;
            if (i >= args.length)
                prUsage();
            width = new Integer(args[i]).intValue();
        } else if (args[i].equals("-h")) {
            i++;
            if (i >= args.length)
                prUsage();
            height = new Integer(args[i]).intValue();
        } else if (args[i].equals("-f")) {
            i++;
            if (i >= args.length)
                prUsage();
            frameRate = new Integer(args[i]).intValue();
        } else if (args[i].equals("-o")) {
            i++;
            if (i >= args.length)
                prUsage();
            outputURL = args[i];
        } else {
            inputFiles.addElement(args[i]);
        }
        i++;
    }

    if (outputURL == null || inputFiles.size() == 0)
        prUsage();

    // Check for output file extension.
    if (!outputURL.endsWith(".mov") && !outputURL.endsWith(".MOV")) {
        System.err
                .println("The output file extension should end with a .mov extension");
        prUsage();
    }

    if (width < 0 || height < 0) {
        System.err.println("Please specify the correct image size.");
        prUsage();
    }

    // Check the frame rate.
    if (frameRate < 1)
        frameRate = 1;

    // Generate the output media locators.
    MediaLocator oml;

    if ((oml = createMediaLocator(outputURL)) == null) {
        System.err.println("Cannot build media locator from: " + outputURL);
        System.exit(0);
    }

    JpegImagesToMovie imageToMovie = new JpegImagesToMovie();
    imageToMovie.doIt(width, height, frameRate, inputFiles, oml);

    System.exit(0);
}*/

static void prUsage() {
    System.err
            .println("Usage: Java JpegImagesToMovie -w <width> -h <height> -f <frame rate> -o <output URL> <input JPEG file 1> <input JPEG file 2> ...");
    System.exit(-1);
}

/**
 * Create a media locator from the given string.
 */
static MediaLocator createMediaLocator(String url) {

    MediaLocator ml;

    if (url.indexOf(":") > 0 && (ml = new MediaLocator(url)) != null)
        return ml;

    if (url.startsWith(File.separator)) {
        if ((ml = new MediaLocator("file:" + url)) != null)
            return ml;
    } else {
        String file = "file:" + System.getProperty("user.dir")
                + File.separator + url;
        if ((ml = new MediaLocator(file)) != null)
            return ml;
    }

    return null;
}

// /////////////////////////////////////////////
//
// Inner classes.
// /////////////////////////////////////////////

/**
 * A DataSource to read from a list of JPEG image files and turn that into a
 * stream of JMF buffers. The DataSource is not seekable or positionable.
 */
class ImageDataSource extends PullBufferDataSource {

    ImageSourceStream streams[];

    ImageDataSource(int width, int height, int frameRate, Vector images) {
        streams = new ImageSourceStream[1];
        streams[0] = new ImageSourceStream(width, height, frameRate, images);
    }

    public void setLocator(MediaLocator source) {
    }

    public MediaLocator getLocator() {
        return null;
    }

    /**
     * Content type is of RAW since we are sending buffers of video frames
     * without a container format.
     */
    public String getContentType() {
        return ContentDescriptor.RAW;
    }

    public void connect() {
    }

    public void disconnect() {
    }

    public void start() {
    }

    public void stop() {
    }

    /**
     * Return the ImageSourceStreams.
     */
    public PullBufferStream[] getStreams() {
        return streams;
    }

    /**
     * We could have derived the duration from the number of frames and
     * frame rate. But for the purpose of this program, it's not necessary.
     */
    public Time getDuration() {
        return DURATION_UNKNOWN;
    }

    public Object[] getControls() {
        return new Object[0];
    }

    public Object getControl(String type) {
        return null;
    }
}

/**
 * The source stream to go along with ImageDataSource.
 */
class ImageSourceStream implements PullBufferStream {

    Vector images;
    int width, height;
    VideoFormat format;

    int nextImage = 0; // index of the next image to be read.
    boolean ended = false;

    public ImageSourceStream(int width, int height, int frameRate,
            Vector images) {
        this.width = width;
        this.height = height;
        this.images = images;

        format = new VideoFormat(VideoFormat.JPEG, new Dimension(width,
                height), Format.NOT_SPECIFIED, Format.byteArray,
                (float) frameRate);
    }

    /**
     * We should never need to block assuming data are read from files.
     */
    public boolean willReadBlock() {
        return false;
    }

    /**
     * This is called from the Processor to read a frame worth of video
     * data.
     */
    public void read(Buffer buf) throws IOException {

        // Check if we've finished all the frames.
        if (nextImage >= images.size()) {
            // We are done. Set EndOfMedia.
            //System.err.println("Done reading all images.");
            buf.setEOM(true);
            buf.setOffset(0);
            buf.setLength(0);
            ended = true;
            return;
        }

        String imageFile = (String) images.elementAt(nextImage);
        nextImage++;

        //System.err.println("  - reading image file: " + imageFile);

        // Open a random access file for the next image.
        RandomAccessFile raFile;
        raFile = new RandomAccessFile(imageFile, "r");

        byte data[] = null;

        // Check the input buffer type & size.

        if (buf.getData() instanceof byte[])
            data = (byte[]) buf.getData();

        // Check to see the given buffer is big enough for the frame.
        if (data == null || data.length < raFile.length()) {
            data = new byte[(int) raFile.length()];
            buf.setData(data);
        }

        // Read the entire JPEG image from the file.
        raFile.readFully(data, 0, (int) raFile.length());

        //System.err.println("    read " + raFile.length() + " bytes.");

        buf.setOffset(0);
        buf.setLength((int) raFile.length());
        buf.setFormat(format);
        buf.setFlags(buf.getFlags() | buf.FLAG_KEY_FRAME);

        // Close the random access file.
        raFile.close();
    }

    /**
     * Return the format of each video frame. That will be JPEG.
     */
    public Format getFormat() {
        return format;
    }

    public ContentDescriptor getContentDescriptor() {
        return new ContentDescriptor(ContentDescriptor.RAW);
    }

    public long getContentLength() {
        return 0;
    }

    public boolean endOfStream() {
        return ended;
    }

    public Object[] getControls() {
        return new Object[0];
    }

    public Object getControl(String type) {
        return null;
    }
  }
}

The doIt function can be called from another class that has a main function:

CreateVideo.java

/*
 * To change this template, choose Tools | Templates
 * and open the template in the editor.
*/
package imagetovideo;

import java.awt.Graphics2D;
import java.awt.RenderingHints;
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.FilenameFilter;
import java.io.IOException;
import java.net.MalformedURLException;
import java.util.Vector;
import javax.media.MediaLocator;


public class CreateVideo{

  public static final File dir = new File("D:\\imagesFolder\\");
  public static final String[] extensions = new String[]{"jpg", "png"};
  public static final FilenameFilter imageFilter = new FilenameFilter() {
    @Override
    public boolean accept(final File dir, String name) {
        for (final String ext : extensions) {
            if (name.endsWith("." + ext)) {
                return (true);
            }
        }
        return (false);
    }
};

// Main function 
public static void main(String[] args) throws IOException {
    File file = new File("D:\\a.mp4");
    if (!file.exists()) {
        file.createNewFile();
    }
    Vector<String> imgLst = new Vector<>();
    if (dir.isDirectory()) {
        for (final File f : dir.listFiles(imageFilter)) {
            imgLst.add(f.getAbsolutePath());
        }
    }
    makeVideo("file:\\" + file.getAbsolutePath(), imgLst);        
}

 public static void makeVideo(String fileName, Vector imgLst) throws MalformedURLException {
    JpegImagesToMovie imageToMovie = new JpegImagesToMovie();
    MediaLocator oml;
    if ((oml = imageToMovie.createMediaLocator(fileName)) == null) {
        System.err.println("Cannot build media locator from: " + fileName);
        System.exit(0);
    }
    int interval = 40;
    imageToMovie.doIt(720, 360, (1000 / interval), imgLst, oml);
 }  
}

Requirements

  • Include jmf-2.1.1e.jar in your library folder (this library is used).

3
ANUJ SINGH