IMS and Home Assistant Communication


  • #31
    Originally posted by Bruce Cloutier View Post
    45 years ago equipment came on the market and manufacturers touted its RS-232 capability for connectivity (although I don't think that word was known at that point). Missing was the fact that you had to have the right cable (connectors and wiring); you had to have the right baud rate, stop bits, and parity; you had to have a protocol for requesting data; and you needed to know how to decode the messages, whether ASCII or binary. The point is that even with MQTT there remains something you need to know to get it to work. Maybe this is where AI comes in? Something to implement that DWIM (Do What I Mean) machine instruction.
    Well, there is something like that for AI nowadays: it's called MCP. If you haven't already, you may want to look into it. It would be cool to have an MCP server running on a JNIOR...



    • #32
      MCP? Like from Tron? Ugh!

      Only a limited percentage of JNIOR installations have Internet access, which this would require. It just seems dangerous in a controls/automation environment. I get it; it still would be interesting.

      Anyone can write a server for the JNIOR. It is not a completely closed system. We would happily show you how it is done.
      Last edited by Bruce Cloutier; Yesterday, 09:45 AM.



      • #33
        Originally posted by Bruce Cloutier View Post
        MCP? Like from Tron? Ugh!
        I guess they call it Ares now, in the latest reboot/sequel.

        But yeah... It's Model Context Protocol. It's a generic JSON-RPC-based protocol (carried over stdio or HTTP) that allows an AI agent to discover the capabilities of an interface and make calls in a somewhat standardized way. So, your MCP server essentially tells the agent what functionality it offers and how to call those functions.
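For the curious, the discovery half of that exchange can be sketched in a few lines. Everything below is hypothetical: the "toggle_relay" tool, its schema, and the channel range are made up purely to illustrate the shape of the protocol, not any real JNIOR or INTEG interface.

```python
import json

# Minimal sketch of MCP-style tool discovery over JSON-RPC 2.0.
# The tool catalog is what the server advertises to the agent.
TOOLS = [{
    "name": "toggle_relay",                       # hypothetical tool
    "description": "Toggle one relay output on the controller",
    "inputSchema": {
        "type": "object",
        "properties": {"channel": {"type": "integer", "minimum": 1, "maximum": 8}},
        "required": ["channel"],
    },
}]

def handle(request: dict) -> dict:
    """Answer a JSON-RPC 'tools/list' request with the tool catalog."""
    if request.get("method") == "tools/list":
        return {"jsonrpc": "2.0", "id": request["id"], "result": {"tools": TOOLS}}
    return {"jsonrpc": "2.0", "id": request.get("id"),
            "error": {"code": -32601, "message": "Method not found"}}

# The agent asks what the server offers, then knows how to call it.
reply = handle({"jsonrpc": "2.0", "id": 1, "method": "tools/list"})
print(json.dumps(reply, indent=2))
```

From there the agent picks a tool and issues a call with arguments matching the advertised schema; that is the whole trick behind the "somewhat standardized" discovery.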

        It's the closest thing to a "Do What I Want" function we have, or what I used to colloquially call the "CEO button": the button that automatically read the CEO's mind and did exactly what the CEO wanted. We never finished that project... unfortunately.

        And if it's not working, you simply need to tell your AI that it will go to jail...



        • #34
          At least he could solve his 2x2x2 Rubik's Cube.

          Um... Somebody will get us back on topic. I hope.



          • #35
            I would like to remind everyone here how much our industry is reliant on open source software.

            Cinema today is built on a foundation of open source. Without it, we would not be here now.
            1. The asdcplib library, created by John Hurst (and others), is the KEY software that allows digital cinema to operate today. Without it, the industry would have devolved into chaos, as the cost of film would likely have made theatrical far less commercially viable. (We are already in a viability crisis anyway; imagine if we still used film, or proprietary digital standards.) There are reasons the industry pulled out all the stops to make sure ATMOS didn't become the immersive audio standard, which is now IAB. They understood that the industry simply could not afford it.

            2. SMPTE and the creation of the DCP standards. This has significantly greased the path for art-house and alternative content, key factors in today's ever-fragmenting market of consumer preferences. (DCP-o-matic being a key open-source software project because of this.)

            From my perspective, taking the industry down the road of dominant open-source automation technologies makes complete sense, as it too will democratise and reduce maintenance costs. Nonproprietary control interfaces and devices should result; they should be cheaper and, more importantly, easier (cheaper) to maintain.

            The biggest pushback on this will come from integrators. They do not like supporting these technologies: if something reduces costs/turnover, it reduces profit, as cost is a percentage of turnover. They have no incentive to adopt these technologies; in reality, they have the opposite. It will have to come from smarter independents who like to get their hands dirty, or larger chains who are their own integrator and understand the long-term benefits.

            Self-maintenance: back in the film days this was the norm; smaller independents loved servicing their own projectors. Not so much now, it's all IT. It's a completely different skill now. We will need a generational change, @Marcel being an example.

            My vote on this topic, keep up the interest in Home Assistant, and please do post here the wins and advancements discovered.



            • #36
              James, your perspective and mine are at odds on most of your points.

              IAB came about because Dolby allowed it and saw a benefit in not locking it in as a proprietary format. My guess is that Ioan, who wants nothing but the best for this industry, is likely the key reason for that. However, call it what you want, we effectively have Atmos for IAB. I don't know if Auro is dead yet, but I don't hear ANYTHING about it anymore. And while DTS-X is there...it is but a most tiny fraction of the market. It is also less regulated in terms of how it is implemented. There really is just one immersive audio system with the illusion of competition, at least as it stands now. GDC is certainly giving it their all to promote their form of DTS-X.

              If film were still the medium of cinema, it is really difficult to say where the industry would be. Exhibitors would have a lot more cash on hand, as the cost of buying and maintaining film projectors is a tiny fraction of digital cinema...which, it would appear, needs a refresh every decade or so. Film projectors lasted what, 50 years? What is the cost/year on film (for an exhibitor)? What was the most expensive repair on a film projector (the intermittent)...now compare that to a digital projector (the light engine). The studios are why we switched. They didn't want to make film prints, and I can certainly understand that. Exhibitors didn't sell one extra ticket for their continual investment in digital projection. We are now in a situation where exhibitors get to pay more.

              The value of standards (SMPTE) is well known in a great many industries, and clearly the cinema industry too. That is why a film from 1920 can run on a projector from 2010. The whole DCI/SMPTE spec for digital cinema continued that along, so that yes, if followed, your system should work. Then again, things do keep changing, such that older servers don't do as well with newer content (particularly subtitles/open captions) and other new formats. Yes, DCP-o-matic has been a godsend to MANY theatres (and I hope they are contributing)...heck, the industry should send him a check every year for making it available.

              As for open source...I'm not inherently against it. From an adoption standpoint, open source seems to be a winner for developers and industrial uses (no need to pay Apple or MS, or put up with their shenanigans, to make a digital cinema server or a VFD). But looking at the world in total, open source hasn't proven to be a winner everywhere. Yes, all cinema servers use Linux...most people do not. I think Linux is still hovering around 5% of desktop operating systems.

              As for pushback from integrators...maybe in your part of the world, but definitely not here (USA). I've never seen an integrator go for the more-expensive locked-in system when a cheaper, well-supported system was available. And the key there is well supported. I've personally had more issues with Linux than with the major OSes, and I'm not a huge fan of them either. I like things that work. "It works" is the Film-Tech Forums feature I look for in all situations. Believe it or not, that is one of the reasons I'm so loyal to Eprad's eCNA automation...it is the most reliable thing in cinema I've ever encountered. The failure rate is practically zero. I think, since the eCNA came out, I might have had one CPU board have an issue. It has also transcended film equipment (the eCNA-200 did) and DCinema. But yeah...it is a closed system...it does what it does, and it will only survive as a product as long as a single company says so. Even Q-SYS, which uses Lua as its scripting language and does run in a Linux environment, really boils down to one company supporting it (QSC). It isn't like it would stop working if Q-SYS were to fold, but it would cease to be developed (unless it was picked up by another entity). So yeah, I wouldn't object to having some form of communication standard that could make integration easier. I see the hoops the A/V people go through controlling Samsung TVs because they have multiple internal standards, which they hide behind an NDA (as if an API were a big trade secret)...so yeah, I'd prefer a standard API to that. We sort of have an open API thing with SNMP...but it is anything but "simple."

              I think the biggest pushback will come from manufacturers that want their own communication. Furthermore, within cinema, they're going to have to maintain their existing APIs indefinitely, since all of the TMS systems are already using them. And since they already have an API in use, why should they adopt another one too?

              I would say that self-maintenance is higher now than in the film days. Nobody is paying us to come out to clean filters. Those that changed their lamps with film change their lamps with digital (and with laser, that job is eliminated). Many things, particularly on Barco, can be user-changed with minimal tools. While the tools for many film projectors were also simple/primitive, film didn't lend itself to the inept. One needed a bit of touch/feel plus a collection of test equipment to work on sound/aligning sound. Furthermore, the sound system, being light-based, was something that degraded over time and needed the exciter lamp/LED changed, cleaned, tweaked/improved. Heck, today I was checking over/adjusting some 35mm and 16mm optical soundheads/digital readers. I had an RTA, an O-scope, a DMM, a gaggle of test films, and the know-how to use them. For DCinema...I'd need...um, nothing...possibly the ability to make an RJ45 connector. My toolbox for DCinema is often a laptop case with a VERY small collection of hand tools (screwdrivers, Torx drivers, and some metric Allen keys). For film, I'd need a full complement of hand tools.

              So, back on topic...as for HA...I have no problem with people promoting it and MQTT, but I just don't see an interest (particularly right now) in yet another means of communication. It appears that most things are trending toward HTTP-based web communication (SOAP/WSDL and the like), and again, if the TMSes are already using the existing APIs, there is less incentive to adopt another one.



              • #37
                @Steve,

                I attended the immersive audio SMPTE meetings extensively when that initiative first began—until the trajectory became clear and, frankly, predictable. I’ve also been involved in most ISDCF meetings over the past 15 years. I tend to call things as I see them, having been in the room for much of this industry’s evolution.

                Regarding the film vs digital debate, the cost burden of film was, in truth, much higher when viewed holistically. While digital has its capital expenses, all costs—whether for film stock, duplication, or logistics—ultimately flowed back to the exhibitor via distributor fees or box office revenue splits. To suggest otherwise is to isolate the hardware cost and ignore the broader ecosystem. It's akin to saying DVDs were just as cost-effective as streaming, without factoring in distribution scale, accessibility, or efficiency.

                On open source: it won years ago. Every smartphone today runs an OS based on open source foundations—whether it’s iOS (based on BSD) or Android (Linux kernel). While desktop OS market share still leans toward proprietary systems, the infrastructure that powers the internet, cloud services, and embedded devices is almost entirely open-source. The cinema industry’s decision to adopt open technologies like Linux was strategic—specifically to avoid vendor lock-in and retain more industry control. Just imagine if Apple had designed cinema servers: 3x the cost, and every piece of content needing pre-approval. That’s not a future anyone in the industry wanted.

                As for integrators, my experience tells a different story than yours. For example, the Christie VPF imposed a non-standard network configuration—custom VLANs and unusual netmasks across every port—effectively ensuring that only the original integrator could service the system. Worse still, passwords to the switches were withheld from the cinemas who own them—arguably a breach of consumer law in my jurisdiction. These passwords are still withheld, even though the VPF program is long over and support from those entities has ceased. The practical result? Cinemas must rip out entire network infrastructures just to regain control—a costly and unnecessary barrier. So yes, from where I sit, many integrators act primarily in their own interest. Not all, perhaps, but enough to warrant caution.

                That’s the perspective I bring.



                • #38
                  Originally posted by Bruce Cloutier View Post

                  Um... Somebody will get us back on topic. I hope.
                  I'd be interested in testing this with a JNIOR but am hesitant to spend a lot on a new one to test with. This "INTEG JNIOR NETWORK I/O ETHERNET CONTROLLER RS-232 7-30 V AC/DC JNR-100-001A" is available on eBay at a doable price, but I've not used one before and can't seem to find anything about this model. Assuming it's old, but would it work for testing? Thoughts?



                  • #39
                    Originally posted by Dustin Grush View Post

                    I'd be interested in testing this with a JNIOR but am hesitant spending a lot on a new one to test with . This " INTEG JNIOR NETWORK I/O ETHERNET CONTROLLER RS-232 7-30 V AC/DC JNR-100-001A" is available on ebay at a doable price, but I've not used one before and cant seem to find anything about this model. Assuming it's old, but would it work for testing? Thoughts?

                    Yeah, that is an older model. Bruce will certainly speak to it, but I believe you could experiment with it; however, I think it won't support the latest JNIOR operating system, so your development efforts might be less portable than desired. For those playing along, the one Dustin found was here: https://www.ebay.com/itm/204251853097



                    • #40
                      Originally posted by Dustin Grush View Post

                      I'd be interested in testing this with a JNIOR but am hesitant spending a lot on a new one to test with . This " INTEG JNIOR NETWORK I/O ETHERNET CONTROLLER RS-232 7-30 V AC/DC JNR-100-001A" is available on ebay at a doable price, but I've not used one before and cant seem to find anything about this model. Assuming it's old, but would it work for testing? Thoughts?

                      That is a Series 3, and we stopped manufacturing them in 2011. The Series 4 was a complete bottom-up redesign of hardware and firmware. Aside from the Series 4 booting 200 times faster, the OS greatly simplified application development while significantly increasing capability. The point being that you can only do MQTT with a Series 4 JNIOR. You can do a lot more with the Series 4, as on the applications side it is not a closed system (the Series 3 essentially was). It is used in a lot of other industries where companies have programmed their own applications and protocols. It is designed to be that SBC that can sit in the middle and do protocol conversions, even those involving those "Neanderthal" GPIO interfaces.
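To tie this back to the Home Assistant discussion: for anyone wondering what the MQTT side could look like from HA's end, here is a rough sketch of the kind of discovery payload Home Assistant expects for a switch entity. The device identifier and topic names below are hypothetical placeholders, not anything a JNIOR actually publishes.

```python
import json

# Rough sketch of a Home Assistant MQTT discovery payload for one
# relay output. Device id and topics are made-up placeholders.
device_id = "jnior_demo"   # assumed identifier, not a real device
channel = 1

# HA watches homeassistant/<component>/<object_id>/config for
# retained discovery messages.
config_topic = f"homeassistant/switch/{device_id}_relay{channel}/config"
payload = {
    "name": f"Relay {channel}",
    "command_topic": f"{device_id}/relay/{channel}/set",    # HA publishes ON/OFF here
    "state_topic": f"{device_id}/relay/{channel}/state",    # device reports state here
    "payload_on": "ON",
    "payload_off": "OFF",
    "unique_id": f"{device_id}_relay{channel}",
}

print(config_topic)
print(json.dumps(payload, indent=2))
```

Publishing that JSON (retained) to the config topic would make HA create the switch automatically; the device then only has to honor the command topic and report on the state topic.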

                      We still support customers with Series 3 issues, but no one should be acquiring them for new applications. To start, the batteries (which in that series are not easily replaceable) are all dead at this point. That doesn't prevent the unit from operating. The Flash memory on many has reached its wear life. There is a risk now with Series 3 that it might just not boot one day; the firmware is stored in the same Flash as data.

                      That all said, we know that there are still 310s from 2005 in daily use.

                      There was a huge effort put into making the transition from Series 3 to Series 4 seamless. We needed all of you who were familiar with the Series 3 to not be confused by the Series 4. So they look the same (only the color of the power LED and the background of the label changed on the package). We were able to recompile applications and keep those relatively the same.

                      There are surplus Series 3 units available. They look to be the same but are not; under the cover, things have been completely redesigned. We have taken the Series 4 platform to new levels of capability over the past decade. We are committed to continuing to manufacture it, develop for it, and support it for at least another decade.

                      Also, be aware that there have been Series 4 units sold (maybe on eBay) where the internal boards have been swapped with a Series 3's. Someone has upgraded their Series 3 by swapping the PCBs and then recouped some of their cost by scamming a buyer into thinking they are getting a new Series 4. That is the downside, I guess, of keeping the housings the same.

                      For that matter, there are surplus JNIOR 1 and JNIOR 2 units out there. You basically can't do anything with them unless you are good at reverse-engineering the hardware. Those we cannot support. The JNIOR as you know it began with the Series 3; the earlier models were programmed for specific applications in PLC environments. Kodak did use some JNIOR 2 units with their pre-show systems. They quickly had us create the Series 3 for them.

                      Since I am promoting... please note that we have not increased prices in the face of these stupid tariffs. We continue to provide our meager technical support for free. And applications (including software changes) are generally done without charge.

                      You know, to comment on prior discussion, there is a spectrum between "open" and "closed". Not everything falls at one end or the other.



                      • #41
                        Open source is an interesting subject (to me). I rent a virtual private server running AlmaLinux that hosts many websites, email, and other things. One of my recent projects is at https://w6iwi.org/rtty/audio/npr/ and https://w6iwi.org/rtty/audio/csm/ .

                        As I recall the original IAB work by SMPTE, we were trying to take the best features of several systems (ATMOS, AURO, DTS-X, and more) to come up with an optimal system. However, it was taking a long time, and once we were done, no one would have had a system that used the standard. Meanwhile, Dolby released a Registered Disclosure Document (RDD) that revealed the bitstream for ATMOS. Dolby also provided a patent disclosure ensuring Reasonable And Non-Discriminatory (RAND) licensing of any patents required for implementation of the bitstream (this did not include the rendering of the audio, just interpretation of the bitstream). Since the standards committee was going nowhere fast, we received direction to build the standard around Dolby's RDD. This is what the committee did. As I recall, the committee added some bed channels at Barco's request. The rest of the work was on clarifying the document such that someone without inside knowledge could implement the bitstream. I suspect Dolby put a lot of money into development of the bitstream and, especially, the renderer that takes the bitstream and sends audio to the right places. I THINK they did not require a license to use the bitstream (though requiring one would have been permissible under RAND). Once the standard was established, suppliers were free to build their own renderer to interpret the bitstream or develop equipment to generate the bitstream.

                        The standard allows interoperability of equipment, but the bitstream was still developed almost completely by one company, then made available to the industry.

                        Open source is an amazing concept, but people still have to make a living. I make monthly contributions to several open source projects that I use frequently.

                        Harold




                        • #42
                          @Harold — Yes, IAB was a mess.

                          Here’s my high-level take on what you’ve described: The key players didn’t want a proprietary system controlling the market. Dolby tried to get ahead by locking it all up through licensing. SMPTE stepped in and began discussions, but in my opinion, those talks were deliberately slowed—likely to give certain leaders time to surge ahead. The industry, however, held firm.

                          It dragged on for an astonishingly long time, even by SMPTE’s famously slow standards. Meanwhile, the incumbent had sunk major investments into technology that was effectively dying on the vine, as the major studios intentionally avoided moving forward with immersive audio altogether.

                          Eventually, that incumbent realized the only way to make money was to stop stalling and push the process to completion. Without immersive soundtracks—the real revenue driver—the domestic market would wither too. So they shifted from delay tactics to getting it done “yesterday.” In the end, competing interests driven by the dollar made it happen.

                          Dolby may not have secured a gatekeeper role in theatrical immersive audio, but they still managed to win—at least partially—in the domestic market. If the process had been delayed any longer, even that market might have collapsed. That said, I’m not convinced it’s been a great success; I don’t see Atmos domestic audio licensing on all that many products.

                          It also explains why Dolby’s speaker positioning—by no means optimal—was pushed onto us. The design was clearly based on “domestic direct compatibility,” with gimmicks like bouncing audio off the ceiling.



                          • #43
                            Originally posted by Harold Hallikainen View Post
                            Open source is an interesting subject (to me). I rent a virtual private server running Alma Linux with lots of stuff on there. Many web sites, email, and other stuff. One of my recent projects is at https://w6iwi.org/rtty/audio/npr/ and https://w6iwi.org/rtty/audio/csm/ .
                            The old school news wire project is nice. Well timed considering I just showed this sequence on the big screen:
