Tamas Kerecsen
2006-09-06 10:42:41 UTC
Hi everyone,
I've noticed some odd behavior since I started experimenting with valuetypes. I
have a sequence that contains valuetype objects. When I set values in the
sequence (doing something like seq[0] = obj1), the reference counts of the
objects are not incremented. However, when the sequence goes out of scope, it
decrements the reference count of every object it contains. This is
asymmetrical.
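To make this concrete, here is roughly the pattern I mean. The IDL and all the
names (Thing, ThingSeq, the header) are just made up for illustration, not
taken from my real code:

  // Assumed IDL:
  //   valuetype Thing { public long x; };
  //   typedef sequence<Thing> ThingSeq;
  // #include "thing.hh"   // whatever header your IDL compiler generates

  void sequence_example(Thing* obj1)  // obj1: a valuetype held with one reference
  {
    {
      ThingSeq seq;
      seq.length(1);
      seq[0] = obj1;   // no _add_ref() happens here (this is what I observe)
    }                  // ...but the sequence destructor calls _remove_ref()

    // obj1 has now lost a reference it never gained; as far as I can tell
    // the only workaround is to call CORBA::add_ref(obj1) before assigning.
  }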
All the rest of the C++ mapping seems to function intuitively (meaning that
you can use _var smart pointers for everything and never have to use raw
pointers or call add_ref or remove_ref directly), except for this sequence
case and the case of _downcast.
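The _downcast case looks something like this (again with the made-up Thing
type; my understanding is that _downcast is just a type-safe cast and does not
touch the reference count, so a manual add_ref is needed before the result can
safely go into a _var):

  void downcast_example(CORBA::ValueBase* vb)
  {
    Thing* p = Thing::_downcast(vb);
    if (p) {
      CORBA::add_ref(p);   // without this, the _var below would release a
      Thing_var tv = p;    //   reference that was never added
      // ... use tv ...
    }
  }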
Is this behavior intentional? Is this a bug (improvement opportunity :)) in
the CORBA spec? Please don't hesitate to share any opinions/speculation!
Thanks,
Tamas