CodeProject.AI and the Google Coral TPU: notes and experiences from Reddit.
You can now run AI acceleration via OpenVINO (Intel CPUs, 6th gen or newer) or Tensor… Yeah, I have 3 (and one coming) 4K cameras at a resolution of 2560x1440.

Stop CodeProject.AI and then let me know if you can start it again. I installed the drivers from the apps section, but it still doesn't work. How's the Coral device paired with CP.AI? On the beta version with YOLOv5 6.1 I only get "call failed" no matter what verbosity I set. Don't mess with the modules. Should mesh be switched on on both PCs? Any thoughts? I'm running BI (5.x) and mesh is ticked on in both. I use it in lieu of motion detection on cameras. 25-100 ms with my T400.

It took a while, but it seems that I have something running here now. I run Blue Iris and the CodeProject.AI server entirely off my CPU, as I do not have a dedicated GPU for any of the object detection. Also running it on Windows with a Google Coral setup and it's working great. Stick to Deepstack if you have a Jetson.

First, there's the issue of which modules I need for it to recognize specific objects. Search for it on YouTube! In the Object Detection (Coral) menu the Test Result is "AI test failed: ObjectDetectionCoral test not provisioned", but I do see requests arriving in the CodeProject.AI dashboard. The object names you list must be the correct case and match the objects the model was trained on.

They are not expensive (25-60 USD), but they seem to always be out of stock. I finally switched to Darknet and got that enabled, but I'm not getting anything to trigger.

If you're running CodeProject.AI Server in Docker or natively in Ubuntu and want to force the installation of libedgetpu1-max, first stop the Coral module from CodeProject.AI. Even if you get it working, the models are not designed for CCTV and have really poor detection. I removed all other modules except for what's in the screenshot, assuming the Coral ObjectDetection module is the only one I'd need.

Has anyone found any good sources of information on how to use a Coral TPU with CodeProject? I ask because my 6700T seems to struggle a bit (18% at idle, 90%+ when motion is detected), and I only have 5 streams of 2 MP cameras. It already has an M.2 NVMe drive that I was intending to use for the OS & DB.

A new version of CodeProject.AI Server was just released, featuring a lot of improvements, including a fresh new frontend interface. It's hard to find benchmarks on this sort of thing, but I get 150 ms to 500 ms CodeProject.AI detection times. It appears that the Python and ObjectDetectionNet versions are not set correctly. VMs and management have their own dedicated 10 Gbps SFP+ connections. I was therefore wondering if people have found any creative use cases for the TPU with Blue Iris.

Remove everything under C:\ProgramData\CodeProject\AI\, and also anything under C:\Program Files\CodeProject\AI\downloads.

I got Frigate running on Unraid and have it connected to Home Assistant, which is in a VM on my Unraid. I have it installed and configured as I would expect based on the tutorials, and CodeProject.ai is running alright. Try a Google Coral: I've got one in a micro OptiPlex, 6th gen i5, 16 GB of memory. I got it working: I had to use the drivers included as part of the Coral module rather than the ones downloaded from Coral's website. Edit (5/11/2024): here's the Coral/CP.AI setup… I just installed Viseron last night and am still tinkering with the config. Ran Scrypted for most of this year.
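Several of the snippets above boil down to the same question: is the server actually answering detection requests, independent of Blue Iris? A quick way to check is to call the server's DeepStack-compatible detection endpoint directly. This is only a minimal sketch, assuming the default port 32168, a server on the same machine, and a local test.jpg; adjust host, port, and image path for your install.

```python
import requests

SERVER = "http://localhost:32168"  # assumption: default CodeProject.AI port on this machine

# Send one image to the DeepStack-compatible object detection endpoint.
with open("test.jpg", "rb") as image_file:
    response = requests.post(
        f"{SERVER}/v1/vision/detection",
        files={"image": image_file},
        data={"min_confidence": 0.4},
        timeout=30,
    )

response.raise_for_status()
result = response.json()
print("success:", result.get("success"))
for prediction in result.get("predictions", []):
    print(f"  {prediction['label']}: {prediction['confidence']:.2f}")
```

If this returns predictions while Blue Iris still reports "call failed" or "AI: not responding", the problem is likely on the BI side (IP/port, mesh, or timeout settings) rather than in the Coral module itself.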
Overall it seems to be doing okay, but I'm confused by a few things and having a few issues. The box I run CodeProject.AI on has 2 x Xeon E5-2640 V4s and 128 GB of RAM. CPU barely breaks 30%.

Double-Take works with CodeProject.AI, CompreFace, Deepstack and others.

Modify the registry (Computer\HKEY_LOCAL_MACHINE\SOFTWARE\Perspective Software\Blue Iris\Options\AI, key 'deepstack_custompath') so Blue Iris looks in C:\Program Files\CodeProject\AI\AnalysisLayer\ObjectDetectionYolo\custom-models for custom models, and copy your models into there.

Sounds like you did not have BI configured right, as choppy video playback is not normal and no one I know sees that as an issue.

The modules included with CodeProject.AI are configured via the modulesettings.json files in the module's directory, typically located at C:\Program Files\CodeProject\AI\modules\<ModuleName>\modulesettings.json, where ModuleName is the name of the module. If you look towards the bottom of the UI you should see all of CodeProject AI's modules and their status.

I have BI running for my business. Not super useful when used with BlueIris for AI detection. I run CodeProject.ai with a Google Coral, but also have Frigate for the Home Assistant integration, and might take the time to dial in sending motion alerts from Frigate to BI to get rid of CP.AI. Inside Docker, I'm pulling in the codeproject/ai-server image.

Thanks for your great insight! I have two Corals (one mPCIe and one M.2); they are both just hanging there doing nothing. Does anyone have best-practice recommendations for a setup with dual Coral? Which model to use (YOLOv5, YOLOv8, MobileNet, SSD), custom models, model size? Can you filter out stuff you don't need with Coral models? I've been trying to get this USB Coral TPU running for far too long.

Viseron is a self-hosted NVR deployed via Docker, which utilizes machine learning to detect objects and start recordings.

Run the ASP.NET Core 7 runtime installer and select Repair. On the main AI settings, check the box next to Use custom models and uncheck the box next to Default object detection.

It seems silly that Deepstack was supporting a Jetson two years ago… it's really unclear why CodeProject AI seems to be unable to do so.

I used CodeProject.AI for object detection at first, but it was giving me a problem. Installation runs through, and on the first start it downloads what it needs for 3 initial modules, including FaceProcessing and ObjectDetection (YOLOv5). I'm using macvlan as the networking config to give it an IP on the LAN. I have seen there are different programs to accomplish this task, like CodeProject.AI and others. I recently switched from Deepstack AI to CodeProject AI. I have been using CodeProject.AI with Blue Iris for nearly a year now, and after setting it up with my Coral Edge TPU a couple of months ago, it has been amazing. CodeProject AI plus the models bundled with Blue Iris worked a lot better for me compared to Frigate.
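The registry tweak described above can be scripted. A hedged sketch using Python's standard winreg module: the key path and value name are taken straight from the post above, the custom-models folder is just the same example path, and the script has to run as Administrator (it writes to HKLM). Restart Blue Iris afterwards.

```python
import winreg

# Key path and value name as described in the post above.
KEY_PATH = r"SOFTWARE\Perspective Software\Blue Iris\Options\AI"
# Example folder; point this at wherever you actually keep custom models.
CUSTOM_MODELS_DIR = r"C:\Program Files\CodeProject\AI\AnalysisLayer\ObjectDetectionYolo\custom-models"

# Requires an elevated (Administrator) Python session, since this writes to HKLM.
with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0, winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "deepstack_custompath", 0, winreg.REG_SZ, CUSTOM_MODELS_DIR)

print("deepstack_custompath set to", CUSTOM_MODELS_DIR)
```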
So I'm not the most tech-savvy; I have BI with CodeProject and it was working perfectly until a few weeks ago. Does anyone have opinions on these two?
I set up Deepstack about a month ago, but I read that the developer is…

The AI setting in BI is "medium". Now I've done a manual install on a fresh Debian 12 LXC and that works rock solid. Coral support is very immature on CPAI; I would not recommend using it. I am using the Coral on my Home Assistant computer to offload some of the work, and now the detection time is 15-60 ms.

I went into the camera settings -> Trigger -> AI and turned on CP.AI. Works great with BI. List the objects you want to detect.

CodeProject AI and Frigate: to start, I have a working Frigate config with about 10 cameras right now. So the next step for me is setting up facial recognition, since Frigate doesn't natively do this. I am still having a couple of scenarios that I'd like to get some help on, and was hoping there are solutions worth exploring.

Restart the AI, heck, even BI: nothing. You can get a full Intel N100 system for $150 which will outperform a Coral in both speed and precision.

More formal support for CodeProject's AI Server, now our preferred no-extra-cost AI provider over DeepStack. CodeProject.AI also now supports the Coral Edge TPUs. CodeProject.ai is rumoured to soon support TensorFlow Lite and Coral. This will most likely change once CPAI is updated.

This post was useful in getting BlueIris configured properly for custom models. I found that I had to install the custom model on both the Windows computer that BlueIris was running on and the Docker container running CodeProject AI in order for my custom model file to get picked up.

I had the same thing happen to me after a power loss. I had to install 2.8 (I think?). Suddenly, about a week ago, it started giving me an AI timeout or not responding. Afterwards, the AI is no longer detecting anything. This started when I installed the current version of CP AI. Original question: is there a guide somewhere for how to get CP.AI running with BI on a Windows machine?

Hey guys, I've seen there is some movement on Google Coral TPU support in CodeProject, and I was wondering if there is any way to make it work with the Blue Iris NVR software. I recently received the Coral TPU and have been trying to find ways to use it with my Blue Iris setup; however, it seems that CodeProject.AI only supports the Coral Edge TPU via the Raspberry Pi image for Docker. Now if CodeProject AI can just start recognizing faces.

If in Docker, open a Docker terminal and launch bash.

I think maybe you need to try uninstalling DeepStack and CodeProject.AI completely, then rebooting and reinstalling the package. Didn't uninstall anything else.

Hey, it takes anywhere from 1-6 seconds depending on whether you use Low, Medium or High mode on Deepstack, in my experience. I ended up buying an Intel NUC to run Frigate on separately, keeping the Wyse for HA. After Googling similar issues I found some solutions.

CodeProject.AI 2.x GPU CUDA support update: speed issues are fixed (faster than DeepStack), with GPU CUDA support for both… I recently switched from Deepstack to CP AI.

Comparing AI analysis of similar alerts between DeepStack and CodeProject.AI: yes, CodeProject was way slower for me, but I don't know why; object type recognition was also way better with CodeProject. CodeProject was not significantly better than Deepstack at the time (4 months ago), but I guess many people have started migrating away from Deepstack by now, and CP.ai isn't worse either, so it may not matter.

It looks like inference takes 150 to 160 ms, according to the logs in the CodeProject AI web interface. I am CONSTANTLY getting notifications on my phone, for all sorts of movement. I get CodeProject.AI detection times with my P620 of probably around 250 ms on average. If I were to upgrade to an A2000, what kind of gains would I expect? I've heard faster cards do not make that much of a difference with detection times. Really sad the CodeProject.ai developers have not prioritized low-cost/high-output GPU and TPU support.

I'd like to keep this build as power efficient as possible, so rather than a GPU, I was going to take the opportunity to move to CodeProject AI with a Coral TPU. Coral's GitHub repo was last updated 2-3 years ago. I have a USB Coral I'm trying to pass through to Docker. Any idea what could cause that? The Coral module is correctly detected in Device Manager. Coral USB TPU set to full precision (didn't…). Hey, looking for a recommendation on the best way to proceed.

The AI is breaking constantly and my CPU is getting maxed out, which blows my mind as I threw 20 cores at this VM. I finally got access to a Coral Edge TPU and also saw that the CodeProject.AI team have released a Coral TPU module, so it can be used on devices other than the Raspberry Pi. The CodeProject.AI Server log shows requests every minute or less when there is no motion detection.
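Since module behaviour is driven by the modulesettings.json files mentioned earlier, a quick inventory of what is installed (and roughly how each module is configured) can be pulled straight from disk. A sketch for a default Windows install; the path is the one quoted above, the exact JSON layout varies between server versions, so the code only prints the top-level keys rather than assuming any particular field names.

```python
import json
from pathlib import Path

# Default Windows install location; adjust for Docker/Linux installs.
MODULES_DIR = Path(r"C:\Program Files\CodeProject\AI\modules")

for settings_file in sorted(MODULES_DIR.glob("*/modulesettings.json")):
    module_name = settings_file.parent.name
    try:
        settings = json.loads(settings_file.read_text(encoding="utf-8-sig"))
    except (OSError, json.JSONDecodeError) as exc:
        print(f"{module_name}: could not read settings ({exc})")
        continue
    # Key layout differs between versions; just show what's there.
    print(module_name, "->", ", ".join(settings.keys()))
```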
The backup node has 2 x Xeon E5-2667 V4s and 128 GB of RAM. Restart AI to apply. Will this work? I see a lot of talk about running on a Raspberry Pi, but not much about Ubuntu/Docker on x86.

For my security cameras, I'm using Blue Iris with CodeProject.AI. I'm using a Coral TPU plugged into the USB port to support CodeProject.AI. I've set it up on Windows Server 2022 and it's working OK. I've got it somewhat running now, but 50% of the time the TPU is not recognized so it reverts to CPU, and about 40% of the time something makes CodeProject just go offline. And from the moment you stop the service, it can take 20-30 seconds for the process to exit.

I've had Deepstack running on my mini server in Docker this way for years. On my i5-13500 with YOLOv5 6.2 I'm seeing analyze times around 280 ms with the small model and 500 ms with the medium model. The small model found far more objects than all the other models, even though some were wrong!

When I look at the BI logs, after a motion trigger it says "AI: Alert canceled [AI: not responding] 0ms". Any ideas? I'm on a Windows machine running BI 5.x. I have an Nvidia 1050 Ti and a Coral TPU on a PCI board (which I just put in the BI server, since I've been waiting on Coral support).

I hear about Blue Iris, CodeProject AI, Frigate, Synology Surveillance Station, and Scrypted. I don't understand what exactly each system does and which of these (or other) tools I would need.

The camera AI is useful to many people, but BI has way more motion-setting granularity than the cameras, and some people need that additional detail, especially if they want AI for more than a car or person.

I have about 26 cameras set up that record substream continuously with direct-to-disk recording, with most cameras using Intel +VPP for hardware decoding. 2023-12-10 15:30:38: ** App DataDir: C:\ProgramData\CodeProject\AI.

11/14/2022 5:11:51 PM - CAMERA02 AI: Alert cancelled [nothing found]
11/14/2022 5:09:12 PM - CAMERA02 AI: [Objects] person: 63%

I use CodeProject AI for BI, only for object detection. Hi, does anyone know how mesh is supposed to work? I was wondering if there are any performance gains from using the Coral Edge TPU.

I have Blue Iris (5.8) running in a Windows VM and CodeProject.AI (2.x.4-Beta) running as a Docker container on unRAID. When I reboot my unRAID server, the Blue Iris VM comes online before the CodeProject.AI container has started and fails to connect.

Here's my setup: at the base I'm running ESXi.
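Because the server process can take 20-30 seconds to exit after the service is told to stop, it helps to stop it and wait before running an installer or uninstaller. A rough sketch using the Windows sc utility from Python; the service name "CodeProject.AI Server" is an assumption on my part, so confirm the exact name in services.msc on your machine first.

```python
import subprocess
import time

SERVICE_NAME = "CodeProject.AI Server"  # assumption -- confirm the exact name in services.msc

# Ask Windows to stop the service, then poll until it reports STOPPED.
subprocess.run(["sc", "stop", SERVICE_NAME], check=False)

for _ in range(12):  # wait up to ~60 seconds
    query = subprocess.run(["sc", "query", SERVICE_NAME],
                           capture_output=True, text=True)
    if "STOPPED" in query.stdout:
        print("Service stopped; safe to run the installer/uninstaller now.")
        break
    time.sleep(5)
else:
    print("Service still running -- wait a bit longer before upgrading.")
```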
For the Docker setup, I'm running PhotonOS in a VM, with Portainer on top to give me a GUI for Docker. Short summary: no.

docker run --name CodeProject.AI -d -p 32168:32168 -p 32168:32168/UDP codeproject/ai-server

The extra /UDP flag opens it up to be seen by the other instances of CP.AI and allows for meshing, which is very useful. That extra flag was missing in the official guide somewhere.

They do not support the Jetson, Coral, or other low-power GPU use. Running BI and CodeProject here on Windows 11. Blue Iris is running in a Win10 VM. There seem to be many solutions addressing different problems.

Everything was running fine until I had the bad idea to upgrade CodeProject to 2.x. CodeProject.AI Server is better supported by its developers and has been found to be more stable overall. Relying on the uninstaller to stop the service and remove the files has been problematic because of this lag in terminating the process.

CodeProject.AI (Deepstack) vs CompreFace: so I've been using DT for a long time now. Revisiting my previous question here, I can give feedback now that I've had more time with CodeProject.ai. I've switched back and forth between CP and CF, tweaking the config, trying to get the most accuracy on facial recognition.

My preference would be to run CodeProject AI with a Coral USB in Docker on an Ubuntu x86 VM on Proxmox. I uninstalled BlueIris as well as CodeProject and re-set everything up, but it still doesn't work. I want to give it GPU support for CodeProject, as I have 15 cameras undergoing AI analysis. I had Deepstack working well, and when CodeProject came out and I heard Deepstack was being deprecated, I made an image, then installed it.

I see in the list of objects that "cat" is supported, but I'm not sure where to enter "cat" to get it working. My driveway camera is great; it's detecting people and cars. But on my indoor cameras I'd like to try using it for person and cat. I have an i7 CPU with built-in…

It's also worth noting that the Coral USB stick is no longer recommended. Coral is not particularly good anymore, as a modern Intel iGPU has caught up and surpassed it. Performance is mediocre: 250 ms+ vs…

These are both preceded by MOTION_A. Hello everyone. The second entry shows that BI sent a motion alert to the AI and the AI confirmed it was a person.

When I start the Object Detection (Coral) module, the logs show the following messages:
17:11:17: Started Object Detection (Coral) module
17:11:43: objectdetection_coral_adapter.py: TPU detected

(Tried YOLOv8 too.) I'm still trying to understand the nuance of Coral not supporting custom models with the most recent updates, since it acts like CodeProject is using the Coral device with the custom models from MikeLud.
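Given how much of the discussion is people comparing 25 ms vs 250 ms vs multi-second detection times, it helps to measure your own round-trip latency the same way Blue Iris experiences it. A rough sketch that times repeated calls to the detection endpoint, with the same assumptions as the earlier example (default port 32168, a representative local snapshot called test.jpg):

```python
import statistics
import time
import requests

SERVER = "http://localhost:32168"   # assumption: default port
IMAGE_PATH = "test.jpg"             # any representative camera snapshot
RUNS = 20

with open(IMAGE_PATH, "rb") as f:
    image_bytes = f.read()

timings_ms = []
for _ in range(RUNS):
    start = time.perf_counter()
    r = requests.post(f"{SERVER}/v1/vision/detection",
                      files={"image": image_bytes}, timeout=60)
    r.raise_for_status()
    timings_ms.append((time.perf_counter() - start) * 1000)

print(f"min {min(timings_ms):.0f} ms | "
      f"median {statistics.median(timings_ms):.0f} ms | "
      f"max {max(timings_ms):.0f} ms over {RUNS} runs")
```

Note this measures the full HTTP round trip, which is what matters for BI triggers, not just raw inference time on the TPU or GPU.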
Now this is working, in that I see the CodeProject web interface when accessing the alternate DNS entry I made pointing to Nginx Proxy Manager, but on the web page, under server URL, I also see the alternate DNS entry, resulting in the logs not showing.

Blue Iris is a paid product, but it's essentially a one-off payment (edit: you do only get one year of updates though).

Has anyone managed to get face recognition working? I tried it many moons ago, but it was very flaky; it barely saved any faces and I ended up giving up. I'm hoping to use it once it supports YOLO and custom models, but that is a while off. CodeProject AI should be adding Coral support soon.

The Coral would fit, but I believe there are issues with the Wyse being an AMD CPU for Frigate (there might be comments on this post to that effect, I can't remember and I'm on my phone, but it's certainly worth having a dive into that issue first).

Am I missing something there? Am I also missing a driver or setting to get the integrated 850 Quick Sync to work with v5 .NET? It's stuck on CPU mode, with no toggle to the GPU option. I was using Deepstack and decided to give CodeProject.AI a try. I have a 2nd PC with CodeProject running on the same IP:port (CP standard) and the same YOLOv5.
Short story is I decided to move my Blue Iris out of my Xeon ESXi VM server and into its own dedicated box. Her tiny PC only has 1 M.2 NVMe slot, which is where I'm putting the Coral TPU, and I'll use the only 2.5" SATA SSD for the Windows OS. Clips and recordings will all be placed on a NAS.

This worked for me for a clean install: after install, make sure the server is not running. Uninstall, delete the database file in your C:\ProgramData\CodeProject folder, then delete the CodeProject folders under Program Files, then reboot, then reinstall CP.AI. Clean uninstall/reinstall. The PIP errors will look something like this: … Turn off all Object Detection modules.

While there is a newer version of CodeProject.AI available, I found it has issues self-configuring. While I am not computer savvy, I have looked through the logs before crashes to see if anything pops out, and there doesn't seem to be anything out of the ordinary. I have read the limited threads on Reddit, IPCamTalk, and CodeProject.ai's forums, and nothing jumps out at me as something I have not tried.

07:52:22: objectdetection_coral_adapter.py: File "C:\Program Files\CodeProject\AI\modules\ObjectDetectionCoral\objectdetection_coral_adapter.py", line 10: from module_runner import ModuleRunner

I ended up reinstalling the Coral module, and also under BI Settings -> AI I put the IP address of the PC running BI in "Use AI server on IP/port", with port 5000. In the CodeProject.AI Dashboard: 19:27:24: Object Detection (Coral): Retrieved objectdetection_queue command 'detect'. It defaulted to 127.0.0.1:82, but on the CP.AI webpage it shows localhost:#####. Is it fine to have these different? The first entry shows that BI sent a motion alert to the AI but the AI found nothing. This is documented in the CodeProject AI / Blue Iris FAQ: Blue Iris Webcam Software - CodeProject.AI. So I assume I am doing something wrong there. If you're new to BlueIris and CP.AI, remember to read this before starting: FAQ: Blue Iris and CodeProject.AI Server.

Despite having my GPU passed through, visible in Windows, and CodeProject seeing my GPU as well… 2023-12-10 15:30:38: Video adapter info: …

Detection times are 9000-20000 ms in BI. Is this latency too long given the hardware? One option is to run the AI in a Docker container inside a Linux VM (on the same hardware). Hopefully performance improves, because I understand performance is better on Linux than Windows? If CodeProject AI added Coral support, I would give it a try. I have CodeProject AI's stuff for CCTV; it analyzes about 3-5 2K-resolution images a second.

I played with Frigate a little bit. By default, Frigate uses some demo ML models from Google that aren't built for production use cases, and you need the paid version of Frigate ($5/month) to get access to better models, which ends up more expensive than Blue Iris. AI FOR ALL! For Frigate to run at a reasonable rate you really needed a Coral TPU.

Usually the Deepstack processing is faster than taking the snapshot, because for whatever reason the SSS API takes 1-2 seconds to return the image (regardless of whether it's using high quality/balanced/low). I have them outside, and instead of using the Blue Iris motion detection, I have a script that checks for motion every second on the camera web service; if there is motion, the script pulls down the image from the camera's HTTP service, feeds it into Deepstack and, if certain parameters are met, triggers a recording.

I bought the Coral TPU coprocessor. It is worth pointing out that they support other models and AI acceleration now.

Hi Chris, glad you've set up a sub, as I personally really struggle with the board; it takes me back to the Usenet days, lol. Anyway, top question for me, as my own Coral has just finally arrived: how goes support for Coral with CodeProject.AI, and is there anything people can do to help?

So I'm not the most tech-savvy; I have BI with CodeProject and it was working perfectly until a few weeks ago. When I open the app, my alerts are very sparse, some weeks old, and if I filter to cancelled I can see all my alerts, but the AI didn't confirm human, dog, or truck.
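When the TPU is only recognized half the time, it is worth checking whether the operating system and the Edge TPU runtime can see the device at all before blaming the CodeProject module. A small sketch using pycoral's helper; it assumes pycoral and the libedgetpu runtime are installed in whatever Python environment you run it from, which is not necessarily the server's own environment.

```python
from pycoral.utils.edgetpu import list_edge_tpus

tpus = list_edge_tpus()
if not tpus:
    print("No Edge TPU found -- check USB cable/port, drivers, and the libedgetpu install.")
else:
    for index, tpu in enumerate(tpus):
        # Each entry reports the transport (usb/pci) and the device path.
        print(f"TPU {index}: type={tpu['type']} path={tpu['path']}")
```

If this reports no device, the "TPU not recognized, reverting to CPU" behaviour is a driver or passthrough problem rather than anything in CodeProject.AI.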
I have been trying to spin up a codeproject/ai-server container with a second Google Coral, but it… I've so far been using purely CPU-based DeepStack on my old system, and it really struggles: lots of timeouts. Problem: they are very hard to get. It seems CodeProject has made a lot of progress supporting the Coral TPU, so I was hoping things are a bit better now. Is anyone able to make it work? Credit for this workaround goes to PeteUK on the CodeProject discussions.

I've been using the typical "Proxmox / LXC / Docker / CodeProject" setup with Coral TPU USB passthrough, but it's been unreliable (at least for me) and the boot process is pretty long.

Update: I just tried Coral + CodeProject AI and it seems to work well! I re-analyzed some of my alerts (right-click on the video -> Testing & Tuning -> Analyze with AI) and detection worked fine.

Is anyone using one of these successfully? The device is not faulty; it works fine on the Synology I'm trying to migrate off of. It does not show up when running lsusb, and appears in the system devices as some generic device.

How is the Tesla P4 working for you with CodeProject AI? Do you run CodeProject on Windows or Docker? Curious because I am looking for a GPU for my Windows 10 CodeProject AI setup. CodeProject AI has better models out of the box.

I installed the custom models (ipcams*) and it worked well for a while. However, for the past week the models field is empty.

I don't think so, but CodeProject.AI has a license plate reader model you can implement. Rob from The Hook Up just released a video about this (a Blue Iris and CodeProject.AI setup for license plate reading). 4 out of 5 are using substreams too. 8 x 2 MP cameras running main and sub streams. Coral over USB is supposedly even worse. However, they use far more power. Coral is ~0.4 W idle and 2 W max, whereas a graphics card is usually at least 10 W idle and can go far higher when in use.

CodeProject unable to install module: I'm getting this error; I tried removing Windows Python and reinstalled it a few times. The installer never opens a… The strange thing is nvidia-smi says the graphics card is "off" and does not report any scripts running.

One note, unrelated to the AI stuff: I messed around with actively cooled RPi4s plus heatsinks for ages before moving to this passively cooled case, which works significantly better and has the added bonus of no moving parts.

Creating a LLM Chat Module for CodeProject.AI Server (4/4/2024, by Matthew Dennis): create a ChatGPT-like AI module for CodeProject.AI Server that handles a long-running process.

It works fine for my 9 cameras. I'm still relatively new to CodeProject and Blue Iris working together. Currently I have a Coral dual TPU running on the same machine as Blue Iris, and it seems to be doing a phenomenal job, usually detecting in less than 10 ms, but sometimes 2000+ ms for the most random objects, like an airplane; I usually don't park any in my backyard, and if there is one, then by the time I get that notification I…

I used the Unraid docker for codeproject_ai and swapped out the sections you have listed.

For other folks who ordered a Coral USB-A device and are awaiting delivery: I placed the order 6/22/22 from Mouser and received it today, 10/17/22.

objectdetection_coral_adapter.py: Using Edge TPU. Coral USB A (2.1 MP): ~35 ms; Coral USB A (12.0 MP): ~200 ms. Obviously these are small sample sizes and YMMV, but I'm happy with my initial tests and Blue Iris Coral performance so far.

Edit (Apr 22, 2024): this conversation took a turn to focus more on Google Coral TPU setups, so I'm editing the title accordingly.

I then followed the advice: uninstalling CodeProject, deleting its Program Files and ProgramData folders, making sure the BI service was not automatically restarting upon reboot, rebooting, reinstalling CodeProject, and installing AI modules before starting BI. For installation, I had to download the 2.x version under the section marked "CodeProject.AI Server Hardware"; I haven't had reliable success with other versions.
However, it doesn't look like it is doing anything: BI shows new items in Alerts when I walk around in front of a camera, but then they go away. The CodeProject status log is showing the requests, but the Blue Iris log is not showing any AI requests or feedback, only motion detects.

In BI, on the AI tab, if I check off custom models, it keeps saying to stop the server and restart to populate, but this doesn't succeed in populating. Still the same empty field. Clicking the "…" button says "Custom models have been added."

By default you'll be using the standard object model. If you plan to use custom models, I'd first disable the standard object model. Yes, you can include multiple custom models for each camera (comma-separated, no spaces, no file extension). If you want all the models, just type *. Now for each camera, go to the settings, then click the AI button.

Within Blue Iris, go to Settings > the "AI" tab > and click "Open AI Console". This should pull up a web-based UI that shows that CPAI is running. Each module tells you if it's running and whether it's running on the CPU or GPU.

A clean reinstall that worked for me: delete C:\Program Files\CodeProject and C:\ProgramData\CodeProject, restart, install CodeProject 2.x, open the AI Dashboard and press Ctrl+R to force-reload it; you should see the modules installing. I stopped YOLOv5 6.2 and used the YOLOv5 .NET module instead. Uninstall the Coral module, then go back to "Install Modules" and re-install the Coral module. I believe I ran the batch file too. Very quick and painless, and it worked great! That was over a month ago.

Should I expect better performance when running the AI in Docker? One thing about CP AI is that you have to stop the service before installing a new version.

Sadly, CodeProject AI is not very environmentally or budget friendly. I have BI on one PC with CodeProject AI set up on YOLOv5 .NET, and it detects OK but slowly. I have Blue Iris on a NUC and it is averaging 900 ms for detection. Getting excited to try CodeProject AI; with the TOPS power of the Coral, what models do you think it can handle best? Thank you!

For folks that want AI and alerts on animals, or specifically a UPS truck, they need the additional AI that comes from CodeProject. Depending on markup, it could be cheaper to get a decent graphics card which supports both the AI detection and ffmpeg acceleration. Free, open-source Frigate combined with a $30 Coral card turns any legacy computer into a top-end NVR. I've been running the latest 0.13 versions as they became available for the last couple of weeks.
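When the custom-models field in BI stays empty, one way to narrow things down is to ask the server directly whether it can run a given custom model, instead of relying on BI to populate the list. A hedged sketch against the DeepStack-style custom-model endpoint: the model name "ipcam-general" is only an example, so substitute the file name (without extension) of whichever custom model you actually installed, and note that exact endpoint behaviour can differ between server versions.

```python
import requests

SERVER = "http://localhost:32168"   # assumption: default port
MODEL = "ipcam-general"             # example name -- use your installed model's file name, no extension

with open("test.jpg", "rb") as f:
    r = requests.post(f"{SERVER}/v1/vision/custom/{MODEL}",
                      files={"image": f}, timeout=30)

print("HTTP status:", r.status_code)
data = r.json()
print("success:", data.get("success"), "error:", data.get("error"))
for p in data.get("predictions", []):
    print(f"  {p['label']}: {p['confidence']:.2f}")
```

If the server answers this correctly but BI still shows an empty list, the problem is the path BI is looking in (see the deepstack_custompath registry note earlier) rather than the models themselves.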