Continuous data streaming using the Spresense camera
-
Hi,
I wanted to know how I can produce a constant video stream from the Spresense camera module to an IoT cloud or online drive. Could anyone please suggest an example or tutorial for this? Thank you
Regards,
Nihal -
@neo11-0 I wanted to do the same. So far I have only experimented a little.
The point is that you need a fast connection. Connecting a WiFi module via UART (like an ESP32 with the AT firmware) is far too slow. You need to connect a module over at least SPI or SDIO, which takes effort (effort I did not put in). It depends on your requirements: do you want 320x240 at 1 fps or full HD at 30 fps?
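As a rough back-of-the-envelope estimate (my own assumption of roughly 20 kB per QVGA JPEG and 300 kB per full-HD JPEG; the real numbers depend heavily on JPEG quality and scene):

QVGA at 1 fps: about 20 kB/s, roughly 160 kbit/s. A UART at 115200 baud moves only about 11 kB/s, so even this already needs a higher baud rate.
Full HD at 30 fps: about 30 x 300 kB = 9 MB/s, roughly 72 Mbit/s, which is clearly SPI/SDIO territory.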
Getting a video stream from the camera itself is easy; just see the Camera examples. If you use the LTE-M board, have a look at its examples and raise the TX buffers, etc.
Do you have any protocol in mind? Might get tricky ...
If you have a cheap LTE-M SIM card, you cannot connect to the device from the outside; that only comes at premium prices because of NAT routing.
Keep in mind that IoT clouds do not permit large image messages. Uploading with HTTP POST might be the simplest option (there is a small sketch at the end of this post). Will you have WiFi at home or not?
A lot of questions to think about
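To make the HTTP POST idea a bit more concrete, here is a minimal, untested sketch of a single-image upload. It assumes the WiFiEspAT library (the same one used in the code further down), a made-up endpoint upload.example.com/upload, and that you already have a JPEG buffer from the Camera library; adapt it to whatever server you pick.

#include <WiFiEspAT.h>

// Placeholder server; replace with your own endpoint.
static const char UPLOAD_HOST[] = "upload.example.com";
static const int UPLOAD_PORT = 80;

// Sends one JPEG buffer as a raw HTTP POST body.
// Note: over an ESP-AT UART link, very large buffers may have to be
// written in smaller chunks.
bool postImage(WiFiClient &client, const uint8_t *buf, size_t len) {
  if (!client.connect(UPLOAD_HOST, UPLOAD_PORT)) {
    return false;  // TCP connection failed
  }
  client.print("POST /upload HTTP/1.1\r\n");
  client.print("Host: ");
  client.print(UPLOAD_HOST);
  client.print("\r\n");
  client.print("Content-Type: image/jpeg\r\n");
  client.print("Content-Length: ");
  client.print(len);
  client.print("\r\n");
  client.print("Connection: close\r\n\r\n");
  client.write(buf, len);  // the image itself
  client.stop();           // not waiting for the response here
  return true;
}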
-
@jens6151-0-1-1 Yes, but SPI and SDIO are quite complicated. Could you please share a document or similar through which I can at least show static images from the Spresense (even with a fairly low refresh rate)? I could find many examples for the ESP-CAM module but not much for the Spresense.
I would be very grateful if you could share your progress with the ESP on uploading static images.
Thanks
-
@neo11-0 I would really need more info to make meaningful suggestions.
- Do you use the Arduino or Spresense SDK?
- Do you have the LTE-M board or use any other hardware for network communication? Which one?
- What is the cloud server you connect to?
- What is the protocol you use?
- Do you expect to see a camera stream opening a website?
Unfortunately, so far I have only wanted to do the same; I have not actually done it yet ... Maybe I will get to it in May.
As I said, I only experimented a little: I used the Arduino SDK with the LTE-M board connected to the AWS IoT cloud via MQTT. I have no code to retrieve the images yet, so it is not too useful so far.
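Just to illustrate the general shape of the publishing side, here is a generic, untested sketch with the ArduinoMqttClient library (not the exact code I used; broker, port and topic are placeholders, and for AWS IoT you additionally need a TLS-capable client configured with the device certificates):

#include <ArduinoMqttClient.h>

// "netClient" is whatever connected Client your transport provides,
// e.g. an LTETLSClient on the LTE-M board or a WiFiClient.
void publishJpeg(Client &netClient, const uint8_t *jpeg, size_t len) {
  MqttClient mqttClient(netClient);
  if (!mqttClient.connect("broker.example.com", 8883)) {
    return;  // MQTT connection failed
  }
  // Announce the payload size up front so the library can frame the message.
  mqttClient.beginMessage("example/camera/image", (unsigned long)len);
  mqttClient.write(jpeg, len);
  mqttClient.endMessage();
  mqttClient.stop();
}
-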
@jens6151-0-1-1 As of now Arduino, using an ESP8266 for communication; the server and host are not decided yet (whichever works), and on the website I would like to have images that are at most 30 minutes old.
-
@neo11-0 I do not have time at the moment to make a step-by-step guide. Please use this as a reference. It is just a copy of the memo I took months ago. If it does not work out of the box, please adapt it.
- Get the ESP8266 toolchain ready e.g. https://docs.espressif.com/projects/esp8266-rtos-sdk/en/latest/get-started/macos-setup.html
- Build and install the AT firmware e.g. https://docs.espressif.com/projects/esp-at/en/release-v2.2.0.0_esp8266/Compile_and_Develop/How_to_clone_project_and_compile_it.html
- Important! Modify factory_param_data.csv so that you select UART pins that are actually available on your board, and choose the baud rate (see the illustrative excerpt right after this list).
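Just as an illustration of which fields I mean (the exact column set differs between ESP-AT versions, so check the header row of the factory_param_data.csv you actually have; the pin numbers below are placeholders, not a recommendation):

...,uart_baudrate,uart_tx_pin,uart_rx_pin,uart_cts_pin,uart_rts_pin   <- header row (excerpt)
...,       115200,         15,         13,          -1,          -1   <- values for your wiring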
The issues with the ESP8266 approach are that it is very slow in transmitting data and that there is a limitation on how much can be sent at a time.
Please take it as a reference and strip out whatever you do not need.
Getting the time is important when you use SSL.
webCam.ino
#include "camera_sensor.h" #include "mqtt_module.h" #include "my_log.h" void setup() { Serial.begin(115200); while (!Serial) { } setup_log(); MqttModule.begin(); CameraSensor.begin(); } void loop() { pollMqtt(); }
camera_sensor.cpp
#include "camera_sensor.h" #include <Camera.h> #include <RTC.h> #include "mqtt_module.h" #include "my_log.h" #define CONFIG_JPEG_BUFFER_SIZE_DIVISOR 3 #define CONFIG_JPEG_QUALITY 75 void CameraSensorClass::printError(enum CamErr err) { if (DEBUG_CAMERA) Log.error("CameraSensorClass Error: "); switch (err) { case CAM_ERR_NO_DEVICE: if (DEBUG_CAMERA) Log.errorln("No Device"); break; case CAM_ERR_ILLEGAL_DEVERR: if (DEBUG_CAMERA) Log.errorln("Illegal device error"); break; case CAM_ERR_ALREADY_INITIALIZED: if (DEBUG_CAMERA) Log.errorln("Already initialized"); break; case CAM_ERR_NOT_INITIALIZED: if (DEBUG_CAMERA) Log.errorln("Not initialized"); break; case CAM_ERR_NOT_STILL_INITIALIZED: if (DEBUG_CAMERA) Log.errorln("Still picture not initialized"); break; case CAM_ERR_CANT_CREATE_THREAD: if (DEBUG_CAMERA) Log.errorln("Failed to create thread"); break; case CAM_ERR_INVALID_PARAM: if (DEBUG_CAMERA) Log.errorln("Invalid parameter"); break; case CAM_ERR_NO_MEMORY: if (DEBUG_CAMERA) Log.errorln("No memory"); break; case CAM_ERR_USR_INUSED: if (DEBUG_CAMERA) Log.errorln("Buffer already in use"); break; case CAM_ERR_NOT_PERMITTED: if (DEBUG_CAMERA) Log.errorln("Operation not permitted"); break; default: break; } } static void streamImage(uint8_t *image_buffer, size_t size) { MqttModule.mqtt_sendImage(image_buffer, size); } static void streamHandler(CamImage img) { if (img.isAvailable()) { if (DEBUG_CAMERA) Log.traceln("streamHandler isAvailable"); streamImage(img.getImgBuff(), img.getImgSize()); sleep(10); } else { if (DEBUG_CAMERA) Log.errorln("Video stream image not yet available"); } } void CameraSensorClass::enableStream(bool enable) { if (enable) { CamErr err = theCamera.startStreaming(true, streamHandler); if (err != CAM_ERR_SUCCESS) { printError(err); } } else { theCamera.startStreaming(false); } } void CameraSensorClass::begin() { if (!mInitialized) { CamErr err; if (DEBUG_CAMERA) Log.traceln("initCamera"); err = theCamera.begin(1, CAM_VIDEO_FPS_30, CAM_IMGSIZE_QVGA_H, CAM_IMGSIZE_QVGA_V, CAM_IMAGE_PIX_FMT_JPG, CONFIG_JPEG_BUFFER_SIZE_DIVISOR); if (err != CAM_ERR_SUCCESS) { printError(err); } if (DEBUG_CAMERA) Log.traceln("Default camera JPEG quality was %d. Set it to %d", theCamera.getJPEGQuality(), CONFIG_JPEG_QUALITY); theCamera.setJPEGQuality(CONFIG_JPEG_QUALITY); /* Auto white balance configuration */ if (DEBUG_CAMERA) Log.traceln("Set Auto white balance parameter"); err = theCamera.setAutoWhiteBalanceMode(CAM_WHITE_BALANCE_DAYLIGHT); if (err != CAM_ERR_SUCCESS) { printError(err); } err = theCamera.setStillPictureImageFormat(CAM_IMGSIZE_QUADVGA_H, CAM_IMGSIZE_QUADVGA_V, CAM_IMAGE_PIX_FMT_JPG, CONFIG_JPEG_BUFFER_SIZE_DIVISOR); if (err != CAM_ERR_SUCCESS) { printError(err); } mInitialized = true; if (DEBUG_CAMERA) Log.traceln("CameraSensorClass::initCamera complete"); } } void CameraSensorClass::end() { if (mInitialized) { theCamera.end(); mInitialized = false; } } void CameraSensorClass::takeImage(int width, int height) { if (DEBUG_CAMERA) Log.traceln("Enter take_image"); CamImage img = theCamera.takePicture(); if (DEBUG_CAMERA) Log.traceln("took a picture"); if (img.isAvailable()) { if (DEBUG_CAMERA) Log.traceln("Picture was available. Size = %d", img.getImgSize()); streamImage(img.getImgBuff(), img.getImgSize()); } else { if (DEBUG_CAMERA) Log.errorln("Failed to take picture"); } } CameraSensorClass CameraSensor;
camera_sensor.h
#pragma once

#include <Camera.h>

class CameraSensorClass {
 public:
  void begin(void);
  void end(void);
  void enableStream(bool enable);
  void takeImage(int width = 0, int height = 0);

 private:
  void printError(enum CamErr err);
  bool mInitialized = false;
};

extern CameraSensorClass CameraSensor;
mqtt_module.cpp
#include "mqtt_module.h" #include "camera_sensor.h" #include "my_log.h" #define ESP_SERIAL_PORT Serial2 #include <AT.h> #include <EspATMQTT.h> #include <WiFiEspAT.h> EspATMQTT mqtt(&ESP_SERIAL_PORT); WiFiClient client; #define TCP_ADDR "192.168.10.3" #define TCP_PORT 20000 #define BROKER_NAME "192.168.10.2" #define BROKER_PORT 1883 #define BROKER_PASS "" #define BROKER_USER "" #define BROKER_SCHEME ESP_MQTT_SCHEME_MQTT_OVER_TCP #define BROKER_RECONNECT 0 #define BROKER_TIMEOUT 5000 #define BROKER_KEEP_ALIVE 280 #define MQTT_TOPIC_Image "jens/feeds/birdroom.image" #define MQTT_TOPIC_Ctrl "jens/feeds/birdroom.control" #define STACKSIZE 1024 enum mqttCommand { MQTT_CONTROL_START_STREAM, MQTT_CONTROL_STOP_STREAM, MQTT_CONTROL_TAKE_IMAGE }; #define TRANSFER_CHUNCK_SIZE 24000 void MqttModuleClass::mqtt_sendImage(uint8_t *image_buffer, size_t size) { if (DEBUG_MQTT) Log.traceln("MqttModuleClass::mqtt_sendImage"); uint32_t chunk_size = TRANSFER_CHUNCK_SIZE; uint32_t transactionLength = 0; uint8_t *pos = image_buffer; mqtt.pubString(DEFAULT_LINK_ID, MQTT_TOPIC_Image, "Start"); for (uint32_t offset = 0; offset < size; offset += transactionLength) { transactionLength = (size - offset) > chunk_size ? chunk_size : (size - offset); mqtt.pubRaw(DEFAULT_LINK_ID, MQTT_TOPIC_Image, (char *)pos, transactionLength, 0, 0); // client.write(pos, transactionLength); pos += transactionLength; } mqtt.pubString(DEFAULT_LINK_ID, MQTT_TOPIC_Image, "End"); // client.stop(); } void pollMqtt() { mqtt.process(); } void sub_cb(char *topic, char *mqttdata) { if (DEBUG_MQTT) Log.traceln("sub_cb topic=%s, message=%s", topic, mqttdata); int command = String(mqttdata).toInt(); if (command == MQTT_CONTROL_START_STREAM) { CameraSensor.enableStream(true); } else if (command == MQTT_CONTROL_STOP_STREAM) { CameraSensor.enableStream(false); } else if (command == MQTT_CONTROL_TAKE_IMAGE) { CameraSensor.takeImage(); } } void connected(char *connectionString) { if (DEBUG_MQTT) Log.traceln("Received connection string: %s", connectionString); mqtt_status_t mqttStatus = mqtt.subscribeTopic(sub_cb, DEFAULT_LINK_ID, MQTT_TOPIC_Ctrl, 0); if (DEBUG_MQTT) Log.traceln("MqttModuleClass::init mqtt.subscribeTopic = %d", mqttStatus); } void got_ntp_time(char *dateTime) { mqtt.enableNTPTime(false, NULL, 0, ""); sleep(1); mqtt_status_t mqttStatus = mqtt.userConfig(DEFAULT_LINK_ID, BROKER_SCHEME, "birdcam0", BROKER_USER, BROKER_PASS); if (DEBUG_MQTT) Log.traceln("MqttModuleClass::init mqtt.userConfig = %d", mqttStatus); sleep(1); mqttStatus = mqtt.connectionConfig(DEFAULT_LINK_ID, BROKER_KEEP_ALIVE, 0, "", ""); if (DEBUG_MQTT) Log.traceln("MqttModuleClass::init mqtt.connectionConfig = %d", mqttStatus); sleep(1); mqttStatus = mqtt.connect(DEFAULT_LINK_ID, BROKER_NAME, BROKER_PORT, BROKER_RECONNECT, BROKER_TIMEOUT, connected); if (mqttStatus == ESP_AT_SUB_CMD_CONN_SYNCH) { // In case we make a direct connection we end up here. 
if (DEBUG_MQTT) Log.traceln("MqttModuleClass::init mqtt.connect = %d, Synchronously connected", mqttStatus); mqttStatus = mqtt.subscribeTopic(sub_cb, DEFAULT_LINK_ID, MQTT_TOPIC_Ctrl, 0); if (DEBUG_MQTT) Log.traceln("MqttModuleClass::init mqtt.subscribeTopic = %d", mqttStatus); } else if (mqttStatus == ESP_AT_SUB_CMD_CONN_ASYNCH) { if (DEBUG_MQTT) Log.traceln( "MqttModuleClass::init mqtt.connect = %d, Asynchronously connected, waiting for callback", mqttStatus); } else { Log.errorln("An error occured (%d) during connection!", mqttStatus); } } void MqttModuleClass::begin() { if (!mInitialized) { if (DEBUG_MQTT) Log.traceln("MqttModuleClass::init"); ESP_SERIAL_PORT.begin(115200 * 2); WiFi.init(&ESP_SERIAL_PORT); WiFi.setPersistent(true); int wifiStatus = WiFi.begin("ssid", "password"); // client.connect(TCP_ADDR, TCP_PORT); if (DEBUG_MQTT) Log.traceln("MqttModuleClass::init WiFi status %d", wifiStatus); mqtt_status_t mqttStatus = mqtt.begin(); if (DEBUG_MQTT) Log.traceln("MqttModuleClass::init mqtt.begin = %d", mqttStatus); mqtt.enableNTPTime(true, got_ntp_time, 8, "ntp.nict.jp"); if (DEBUG_MQTT) Log.traceln("MqttModuleClass::init mqtt.enableNTPTime = %d", mqttStatus); mInitialized = true; if (DEBUG_MQTT) Log.traceln("MqttModuleClass::init complete"); } } void MqttModuleClass::end() { if (DEBUG_MQTT) Log.traceln("MqttModuleClass::end"); if (mInitialized) { Serial2.printf("AT+MQTTUNSUB=0,\"%s\"\r\n", MQTT_TOPIC_Ctrl); Serial2.write("AT+MQTTCLEAN=0\r\n"); } } MqttModuleClass MqttModule;
mqtt_module.h
#pragma once

#include "Camera.h"

class MqttModuleClass {
 public:
  void begin();
  void end();
  void mqtt_sendImage(uint8_t *image_buffer, size_t size);

 private:
  bool mInitialized = false;
};

extern MqttModuleClass MqttModule;

void pollMqtt();
my_log.cpp
#include "my_log.h" #include <ArduinoLog.h> void printTimestamp(Print* _logOutput) { // Division constants const unsigned long MSECS_PER_SEC = 1000; const unsigned long SECS_PER_MIN = 60; const unsigned long SECS_PER_HOUR = 3600; const unsigned long SECS_PER_DAY = 86400; // Total time const unsigned long msecs = millis(); const unsigned long secs = msecs / MSECS_PER_SEC; // Time in components const unsigned long MiliSeconds = msecs % MSECS_PER_SEC; const unsigned long Seconds = secs % SECS_PER_MIN; const unsigned long Minutes = (secs / SECS_PER_MIN) % SECS_PER_MIN; const unsigned long Hours = (secs % SECS_PER_DAY) / SECS_PER_HOUR; // Time as string char timestamp[20]; sprintf(timestamp, "%02d:%02d:%02d.%03d ", Hours, Minutes, Seconds, MiliSeconds); _logOutput->print(timestamp); } void printLogLevel(Print* _logOutput, int logLevel) { // Show log description based on log level switch (logLevel) { default: case 0: _logOutput->print("SILENT "); break; case 1: _logOutput->print("FATAL "); break; case 2: _logOutput->print("ERROR "); break; case 3: _logOutput->print("WARNING "); break; case 4: _logOutput->print("INFO "); break; case 5: _logOutput->print("TRACE "); break; case 6: _logOutput->print("VERBOSE "); break; } } void printPrefix(Print* _logOutput, int logLevel) { printTimestamp(_logOutput); printLogLevel(_logOutput, logLevel); } void setup_log() { Log.setPrefix(printPrefix); Log.begin(LOG_LEVEL, &Serial); Log.setShowLevel(false); }
my_log.h
#pragma once

#include <ArduinoLog.h>

#define DEBUG_CAMERA 1
#define DEBUG_MQTT 1
#define LOG_LEVEL LOG_LEVEL_VERBOSE

void setup_log();